Award Winner 2017: Manchester City Council – Troubled Families Evaluation

Summary

Manchester City Council has been tracking 5,000 families over the past 6 years to understand, inform and evidence the delivery of the National Troubled Families programme. The evidence produced in the latest update is some of the best evaluation in the country on the programme, and goes broader and deeper than the Government’s National Impact Study could when it was published last year.

The project has required data systems to be designed and implemented (linking to our 2015 LARIA Award-winning work on iBase), evaluation and statistical methods to be applied, engagement and qualitative approaches to be used with services, and a powerful narrative to influence Senior Managers, Local Politicians, Partners and Government Departments.

The evaluation has shown that sustaining this new way of working, despite annual budget pressures and Ofsted inspection priorities, has delivered real benefits for families which are widespread and sustained. As a system we are now reaping the benefits of protecting the investment in TF approaches, and the investment in Performance and Evaluation means that we now know much more about what works best with families, why, and also what works less well.

Wow factor

  1. Investing in evaluation is a critical enabler of PSR
  2. The TF programme isn’t one single group, but rather a series of clusters placing demand on core services in different ways
  3. The Troubled Families programme within Manchester has, unlike some National Headlines, delivered statistically significant and sustained impacts

Synopsis

Manchester committed to understanding the impact of the National Troubled Families programme on families and services. This involved designing and implementing a performance and evaluation framework and the systems to support the data collection.

Data about 5,000 families, covering over 15,000 people, has been tracked and analysed over the past 4 years. The first 2 years of tracking and analysis predominantly took the form of performance monitoring and management, which influenced programme strategy and delivery but didn’t answer the questions on causality and added value. The last 2 years have seen the data turned into evaluation and insight, directly informing policy, finance and operational decisions.

This most recent wave of the evaluation, which took place at the end of 2016, took assessment information from 4,000 families in the programme and identified the common presenting needs. With the scale of data now available and the understanding built up over the last 4 years, we’ve been able to demonstrate not only silos of demand, but also the complex patterns of demand. For example, the data has shown for some time that around two-thirds of families have someone with a mental health concern; what this latest wave of evaluation has enabled us to do is look at the needs correlated with mental health (e.g. 45% of families with a mental health issue are also involved in ASB, and 70% have children with Safeguarding needs). This is generating a true whole-system view of support needs, rather than one dictated by current structures and services.
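The co-occurrence analysis described above amounts to calculating conditional rates of one need given another. A minimal sketch, using entirely hypothetical family-level flags rather than the programme's actual data or figures:

```python
# Illustrative family records -- the real dataset holds assessment
# information for thousands of families; these flags are invented.
families = [
    {"mental_health": True,  "asb": True,  "safeguarding": True},
    {"mental_health": True,  "asb": False, "safeguarding": True},
    {"mental_health": False, "asb": True,  "safeguarding": False},
    {"mental_health": True,  "asb": False, "safeguarding": False},
]

def conditional_rate(records, given, target):
    """Share of records with the `target` need among those with the `given` need."""
    subset = [r for r in records if r[given]]
    if not subset:
        return 0.0
    return sum(r[target] for r in subset) / len(subset)

# e.g. ASB and Safeguarding rates among families with a mental health need
print(conditional_rate(families, "mental_health", "asb"))
print(conditional_rate(families, "mental_health", "safeguarding"))
```

Running the same calculation across every pair of presenting needs yields the kind of correlated-need picture the evaluation describes.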

The evaluation also looked at the delivery side of the service, identifying specifically how assessment and allocation processes can affect family engagement and success. This part of the analysis has enabled more accurate business planning: from the total number of expected referrals, we can derive how many are likely to translate into full assessments and cases. This means that a finer-grained deployment of resources can be made, thinking about who does allocation/assessment and who does case delivery. We’ve also seen that the first 2 years of performance management reduced average case duration, and that this has been a positive move in terms of family outcomes.
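The referral-to-case derivation above is, in essence, a conversion funnel. A sketch with assumed conversion rates (the evaluation derives the real rates from historic assessment and allocation data; every number below is hypothetical):

```python
# Hypothetical planning inputs -- not the programme's actual figures.
expected_referrals = 1000
assessment_rate = 0.70   # assumed share of referrals reaching full assessment
case_rate = 0.60         # assumed share of assessments becoming open cases

expected_assessments = expected_referrals * assessment_rate
expected_cases = expected_assessments * case_rate
print(expected_assessments, expected_cases)  # 700.0 420.0
```

With separate rates per stage, staffing for allocation/assessment can be planned independently of staffing for case delivery.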

The core part of the evaluation has been a series of descriptive analyses, looking at impact on 20+ outcomes. These have shown a statistically significant improvement in sustained outcomes (i.e. over 12 months) when comparing the cohort of families at the start and end of their intervention. We’ve also been able to draw out the areas where the programme has been more or less successful.
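One way such a start-versus-end comparison can be tested for significance is with a two-proportion z-test. This is only an illustrative sketch with invented counts, not the evaluation's actual method or data; because the same families are measured twice, a paired test such as McNemar's would arguably be more appropriate in practice.

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """z-statistic for the difference between two proportions (pooled SE)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical: an outcome met by 30% of families at the start of
# intervention and 38% at the end, in a cohort of 1,000.
z = two_proportion_z(300, 1000, 380, 1000)
print(abs(z) > 1.96)  # significant at the 5% level in this sketch
```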

The biggest development, however, has been around using clustering techniques to look at the impact of the programme through different lenses, rather than assuming a one-size-fits-all perspective. What this has shown is that, depending on the characteristics of the families, the programme can expect to have different levels of added value when compared to similar families in a Business as Usual or Statutory sense.
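The clustering idea can be sketched with a minimal k-means implementation. The source doesn't state which algorithm or features were used, so the feature vectors below (e.g. count of presenting needs, count of service contacts) and the choice of k-means are illustrative assumptions:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means: assign points to nearest centroid, then re-average."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])),
            )
            clusters[nearest].append(p)
        centroids = [
            tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

# Hypothetical family feature vectors: (number of needs, service contacts)
families = [(1, 2), (2, 1), (1, 1), (8, 9), (9, 8), (8, 8)]
centroids, clusters = kmeans(families, k=2)
print(sorted(len(c) for c in clusters))  # two clusters of three families each
```

Impact can then be estimated per cluster against a comparable Business as Usual group, rather than averaged across the whole cohort.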

Lessons learned

We would like LARIA members to take two things away:

  1. As Researchers, Analysts and Data Scientists we need to question, challenge and engage with policy/decision makers, ensuring that there is a commitment to appropriate long-term performance and evaluation and an appetite to listen and respond to emerging findings.
  2. That the Troubled Families programme within Manchester has, unlike some National Headlines, delivered statistically significant and sustained impacts. This has been as much about the evaluation design as about actual service delivery. We always knew that at a case level services improved families’ lives, but the fact that we’ve invested in systems and analysis to support this work, and used appropriate techniques like cluster analysis to unpick what could be viewed as a blanket approach, means that we can now evidence it.

The messages from this evaluation have been delivered straight to our Senior Management Team, Full Council, Head of Finance and Operational Leads, alongside feeding back to the Troubled Families Unit nationally. All have endorsed its completeness and committed to responding to it.
