Russell Webster

Criminal Justice & substance misuse expert and author of this blog.

Evaluating the impact of the Justice Data Lab

I can foresee a situation in the near future where the MoJ links its own funding to participation in the Justice Data Lab and puts pressure on other funders to do the same.

How is the Justice Data Lab doing?

New Philanthropy Capital, or NPC as it is now known, describes itself as a “charity think tank and consultancy” and was the prime architect of the Ministry of Justice’s Justice Data Lab.

The JDL was launched in April 2013 to establish the effectiveness of a range of interventions designed to prevent re-offending. To use the service, organisations simply provide details of the offenders they have worked with and information about the services they have provided. The Lab then supplies the aggregate one-year proven re-offending rate for that group and, most importantly, that of a matched control group of similar offenders.

NPC has just (30 July 2015) published a review of the JDL to date, entitled “Under the microscope: Data, charities and working with offenders.”


How effective is the JDL approach?

The JDL is the only way that non-government organisations can access information about whether they have an impact on reoffending without employing professional researchers to negotiate access to Police National Computer data. It is not a perfect solution, as NPC points out:

  • Reconviction rates do not tell us everything: the journey away from crime is long and complex and organisations can still contribute to it, sometimes significantly, without being able to show their impact in this kind of analysis. This is the nature of desistance.
  • Reporting an average re-offending rate for a group of ex-offenders undoubtedly hides a range of successes and failures. For example, projects shown to be effective for some may still be useless or even harmful for others.
  • The process of matching the control group cannot account for all factors, particularly when organisations are working with very difficult or complex individuals. For example, the Justice Data Lab is not really appropriate for organisations that target substance misusers because there is no variable on the PNC to match this sample with other substance misusers. Conversely, some organisations may get ‘false positives’ because their service users are less predisposed to re-offend in the first place.
  • The laws of statistical reliability mean that organisations that have worked with larger numbers of people are more likely to get a definitive result. The minimum number of service users organisations can submit to the Justice Data Lab is 60, but even at this level, the findings are most likely to be inconclusive.
  • The Justice Data Lab cannot answer more detailed questions, such as why an intervention worked or failed, or the optimum type or level of intervention in different circumstances.
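The sample-size point above can be illustrated with a quick back-of-the-envelope significance test (the 40%/35% rates below are hypothetical, chosen for illustration and not taken from the report): even a respectable five-point reduction in re-offending, observed across only 60 offenders per group, falls well short of conventional statistical significance.

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """z statistic for comparing two proportions, using a pooled standard error."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical figures: a 5-point drop in re-offending (40% control vs 35%
# treated) with 60 offenders in each group -- the JDL's minimum submission.
z = two_proportion_z(0.40, 60, 0.35, 60)
print(round(z, 2))  # ~0.57, far short of the 1.96 needed for p < 0.05
```

At the JDL's minimum of 60 service users, differences of this size are simply drowned out by sampling noise, which is why most small submissions come back inconclusive.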


What have we learnt so far?

The short answer is that we haven't learnt as much as we had hoped. Of the 125 analyses conducted so far, 29 have shown that the service in question is associated with a reduction in re-offending, 89 have been inconclusive (mainly because samples have been too small), and 7 have shown that services are associated with an increase in re-offending. Where results are positive, reductions in the re-offending rate have been in the region of 1 to 10 percentage points.

Given these limitations, NPC’s further analysis is on fairly thin ground. However, the report does provide an interesting breakdown of intervention by sector, “comparing” the impact of providers from the voluntary, private and public sectors as well as educational institutions:

[Chart: NPC analysis of Justice Data Lab results by sector]

As you can see, the public sector comes out best, followed by the voluntary sector, even though the sample sizes are very small. And that is the major worry: when NPC consulted voluntary sector organisations working with offenders as part of its work to develop the JDL, there was a lot of enthusiasm for the project. However, fewer than 40 charities have used the Lab. This appears to be for a number of reasons, including:

  • Concerns over data protection – some charities worried that they did not have offenders’ consent to share their information with the MoJ.
  • Some organisations did not have sufficient information on the people they work with (even though the JDL requirements are pretty minimal).
  • For a number of organisations, the JDL wasn't able to match the group of offenders they were working with accurately – if you're working with people with very complex needs, including mental health and substance misuse, it's reasonable to measure your performance within the context of this very high-risk (of reoffending) group.
  • My sense is that a number of organisations saw the early results and wondered if it was in their best interests to get involved in an initiative which might show that they were ineffective.


Conclusions

This leaves the JDL in a tricky place: it has now been confirmed as a permanent resource designed to inform policy and funding around reoffending. However, if providers continue to vote with their feet and decline to submit their data, it will be of limited value. I can foresee a situation in the near future where the MoJ links its own funding to participation in the JDL and puts pressure on other funders to do the same.

 


5 Responses

  1. An accurate and clear blog as usual Russell, I am a fan! When we started the Justice Data Lab project, our initial assessment, gathered from the sector, was that there was a large demand for this service, as access to re-offending data was previously incredibly difficult. In practice, as identified within the report, demand has been lower than anticipated; a core reason, we would agree, is that the risk of having your results published online for everyone to see has been too great, especially as there has been so much change and uncertainty caused by TR. There are also the issues of the JDL not being appropriate for all organisations, and for that, the MoJ has been working very hard to try to improve the metrics and matching for this service. See my blog about this [http://www.thinknpc.org/blog/justice-data-lab-now-a-permanent-service/]

    Within the report we focused on statistically significant results as they are intuitively easy to understand (and more newsworthy), but I think there is still important work to be done looking at reports which had an inconclusive result. Not all of these reports had small sample sizes, and certainly for some organisations (Safe Ground comes to mind) an inconclusive result was found for overall reoffending rates, but they did receive a statistically significant result for reductions in the frequency of offending. Knowing what potentially doesn't work is just as important as knowing what does, and I think there is more work to be done in exploring this.

    We're hosting an event on 22nd September looking at the Justice Data Lab and other Data Labs which are on the way. It would be great if you and others who are interested could join us [http://www.thinknpc.org/events/data-labs-opening-up-government-data/]

  2. Thanks for your comment, Tracey. I do agree that there is value in learning what doesn’t work as well as what does – sometimes, perhaps even more so!
    The move to transparency and a focus on outcomes has to be a positive, whatever the JDL’s teething problems.

  3. Totally agree with transparency and that we should all be encouraged to produce results. We have only worked with 65 offenders, but as we have access to Police, Prison and Probation systems we can accurately and confidently give our statistics.
    Currently we have a reoffending rate of 14%, and this is having worked with 35% PPOs!
    Just a shame we cannot produce our figures via the JDL.
