First results
When the MoJ announced the launch of its Data Lab in April 2013, I gave it a hearty welcome, with the proviso that the proof of the pudding would be in the eating.
Its function was quite clear.
The Lab is open to any organisation working with offenders which wants to evidence how effective its work is at reducing reoffending.
To use the service, organisations simply provide details of the offenders they have worked with and information about the services they have provided.
The Lab then supplies an aggregate one-year proven reoffending rate for that group and, most importantly, for a matched control group of similar offenders.
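In effect, the headline output boils down to the difference between two reoffending rates, with a confidence interval around that difference. The sketch below is a rough illustration only – the figures are invented, the function name is mine, and the Lab's own matching and estimation methodology is more sophisticated than this simple normal approximation:

```python
import math

def rate_difference(treated_reoffenders, treated_n,
                    control_reoffenders, control_n, z=1.96):
    """Difference in one-year proven reoffending rates (treated minus
    matched control), with an approximate 95% confidence interval."""
    p1 = treated_reoffenders / treated_n
    p2 = control_reoffenders / control_n
    diff = p1 - p2
    # Standard error of a difference between two independent proportions.
    se = math.sqrt(p1 * (1 - p1) / treated_n + p2 * (1 - p2) / control_n)
    return diff, (diff - z * se, diff + z * se)

# Invented figures: 25 of 100 treated offenders reoffend within a year,
# against 35 of 100 in the matched control group.
diff, (low, high) = rate_difference(25, 100, 35, 100)
print(f"difference: {diff:+.1%}, 95% CI: ({low:+.1%}, {high:+.1%})")
# difference: -10.0%, 95% CI: (-22.6%, +2.6%)
```

Even an apparent ten-point reduction can fail to reach significance when the cohorts are this small – worth bearing in mind when reading the results below.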
Well, last week the MoJ published the results of the Lab’s first six months’ operation – and they are disappointing.
The pudding is very small and somewhat undercooked.
Activity
The purpose of the Justice Data Lab was to make it possible for small voluntary organisations to find out if their work with offenders made a difference to reoffending rates.
It was launched as part of the Transforming Rehabilitation project to give government, commissioners and prime providers a way of comparing the impact of different providers delivering a range of interventions.
Despite the strong publicity surrounding the launch of the Data Lab, with voluntary sector providers strongly encouraged to believe that participation would stand them in good stead to win business in the TR probation reforms, there were only 52 submissions in the first six months of operation.
Seven of these have been fully answered.
Seven could not be answered because organisations were unable to meet the minimum data criteria (basically name, date of birth, gender, sentence type and, if possible, date of sentence).
The rest are still being processed.
Even with the seven requests which were answered, the MoJ struggled to match offenders in most cases.
Offenders were matched in only 799 of the 3,335 cases submitted – just under a quarter.
To be fair to the Data Lab, it has set the minimum data requirements very low, and several organisations were still unable to provide this basic information for a large proportion of their caseload.
In other cases, it’s not possible to tell whether it was the voluntary sector organisation or the MoJ whose data were not properly recorded.
Outcomes
But the biggest disappointment is that so far the Data Lab has been able to tell us very little about what works in reducing reoffending:
- Being offered short-term, full-time employment by Blue Sky reduces reoffending by between 1 and 23 percentage points (on the width of these ranges, see the sketch after this list)
- Brighton’s Preventing Offender Accommodation Loss scheme reduces reoffending by between 1 and 38 percentage points
- There is insufficient evidence to assess the impact on reoffending of entering a Koestler Award
- There is insufficient evidence to assess the impact on reoffending of completing the Sycamore Tree Project
- There is insufficient evidence to assess the impact on reoffending of the Family Man course run by Safe Ground
- The reoffending rate for prisoners released from HMP Armley who saw the Shelter housing advice service was higher than that of the control group – but the control group wasn’t matched for homelessness/accommodation issues.
- There is insufficient evidence to assess the impact on reoffending of the Swansea Community Chaplaincy project
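Those wide ranges, and the “insufficient evidence” verdicts, are largely a product of cohort size. Sticking with the same back-of-the-envelope approach as above (an invented 40% baseline reoffending rate and a simple normal approximation, not the Lab's own methodology), the width of the interval shrinks roughly with the square root of the cohort size:

```python
import math

def ci_half_width(p1, n1, p2, n2, z=1.96):
    """Approximate 95% CI half-width for a difference in two proportions."""
    return z * math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)

# Invented 40% one-year reoffending rate assumed in both groups.
for n in (50, 200, 1000):
    half = ci_half_width(0.4, n, 0.4, n)
    print(f"{n:>5} offenders per group -> interval roughly +/- {half:.1%}")
#    50 offenders per group -> interval roughly +/- 19.2%
#   200 offenders per group -> interval roughly +/- 9.6%
#  1000 offenders per group -> interval roughly +/- 4.3%
```

On these rough numbers, a cohort of a few hundred matched offenders is about the point at which a realistic effect could be distinguished from noise.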
Conclusion
Well, it’s early days and I still applaud the commitment to transparency about reoffending rates.
Nevertheless, the next set of statistics, presumably out in April 2014, will need to be rather more informative for the Data Lab to keep running past its initial one-year pilot.
If you’ve applied to the Justice Data Lab, please share your experiences below.
4 responses
Hi Russell
Agree with a lot of your points here, but while these findings offer only a limited assessment of what works, it’s still a very significant step. Your initial post on the JDL outlined the difficulties charities face in understanding their impact. Now, with the JDL, they can access very robust outcome data with a comparison to a counterfactual.
Perhaps the biggest step is that all this information is open for all to see, which makes it an important shift towards better evaluation and shared learning by the sector. Also, from a technical point of view, it’s rare to see reports that are up-front about their level of confidence in estimating reoffending. We generally read that something is performing well, with confidence intervals buried at the back of a technical appendix or not reported at all. And we never see reports of what doesn’t work.
Of course, the matching rate could be better, but this should improve with time and practice. It’s one of many reasons why the JDL should be established as a long-term investment rather than a pilot.
The JDL is never going to answer all our questions about what works. Rather, it’s down to us to use its findings to develop our understanding. Only seven reports have been released so far because it’s taken the MoJ a while to get the format and process right. But even with these we can already start to ask some interesting questions about what makes Blue Sky and BHT effective. A new tranche of reports will be released monthly from now on, meaning that results about what works will accumulate.
Above all, the charities involved should be commended for being the first to show a willingness to test themselves.
Hi Tracey
Thanks very much for your comments which I think are very fair.
The commitment from charities to be measured and MoJ to publish transparent data should both be applauded.
I’m hoping that the next wave of reports will be more interesting – surely there must be examples where voluntary sector organisations have accurate data for a few hundred offenders?
I think the main challenge will be the limitations on the MoJ’s ability to match data on need: OASys is not up to the task and, in any case, is often not completed.
But it would be much more valuable if we could match drug interventions against matched cohorts of drug-using offenders, housing interventions against cohorts of homeless offenders, and so on.
Russell
Coincidentally, I’ve been looking at the data lab stats and agree that so far it doesn’t show much promise. I’ve looked particularly at the Blue Sky data, which I’m afraid amounts to a statement of the obvious. I thought I would take a peek at what use Blue Sky were making of the findings on their website, and you can see my conclusions in my blog: http://maroscoe.wordpress.com/2013/10/16/spurious-statistics. I’m afraid the indications are that organisations are going to use the data to make inflated claims.
Thanks for your comment Martin.
Despite my somewhat negative review above, I really hope the Justice Data Lab persists and grows.
The main problem with most of the submissions to date is the small number of offenders in the cohort. I am sure there are lots of voluntary sector organisations that have been working in the CJS for years and have much larger cohorts.
The analysis may be basic at the moment, but the encouragement to focus on outcomes in a transparent, comparable way must be a good thing…