Summary of consultation workshops on the development of an interactive tool for payment by results
This document summarises the key learning points from two workshops held on 18 January 2016 at the Cockpit Arts Centre in London. The workshops were attended by a total of 14 individuals, both commissioners and providers, with experience of PbR; they came from a range of sectors including criminal justice, employment, health, homelessness and substance misuse.
Key learning about payment by results
The participants shared their learning from being involved in commissioning and delivering PbR schemes and identified the following key features:
PbR Dos
- Ensure that staff from all departments/agencies involved in service delivery are engaged in the process from the start, and maintain a continuing dialogue.
- Forecast demand under different scenarios at an early stage.
- Ensure the reliability of your data; it is critical.
- Ensure commissioners are clear about their objectives.
- Base outcomes on service user needs and realistic targets.
- Underpin outcomes with qualitative evidence.
- Limit the number of outcomes and ensure they are simple and measurable.
- Outcomes for complex service users may need to incorporate an understanding of “distance travelled”.
- Ensure that outcome measures are translated into structured targets for frontline staff.
- Be clear about the issues that providers have control over, and those which they don’t (e.g. many schemes are undermined by a lack of access to secure housing).
- Expect unintended consequences.
PbR Don’ts
- Don’t let cash flow requirements exclude smaller organisations; cash flow is a key issue for providers.
- Don’t write rigid contracts; they must be flexible enough to adapt to real-world changes and unexpected consequences.
- Don’t expect providers to move from an output-focused culture to an outcome-oriented one without investing in their own development.
- Don’t remove the incentive for providers to continue service delivery once initial targets have been achieved.
- Do not depend on PbR to articulate the whole value of what can be delivered/achieved.
- Don’t assume that PbR necessarily delivers more bang for the buck.
- Providers should not bid for PbR contracts if their initial analysis suggests that the financial numbers do not add up.
PbR priorities
- Honest, transparent and open dialogue between commissioners and service providers.
- Review and monitor progress regularly.
Designing the interactive tool
Participants were asked five key questions about the PbR tool:
- What would they like to gain from using such a tool?
- What issues would they like to see covered in the tool?
- What characteristics should the tool have?
- How long should the tool take to complete?
- Could they recommend other good quality tools from which to learn?
Purpose of using the tool
- Indicate the general direction of travel, with links to relevant, detailed and up-to-date resources.
- Encourage commissioners to be aware of provider issues and vice versa.
- Provide information and advice on key issues rather than a simple yes/no verdict on whether PbR is appropriate.
- Highlight pros, cons, risks and unintended consequences.
- Facilitate better commissioning decisions.
- Provide evidence-based information that can be shared with senior managers/trustees.
- Make the reasoning behind its guidance transparent.
Issues covered by the tool
- Indication of additional costs such as the time involved in designing and validating outcome measures.
- Awareness of the impact on the wider market.
- Awareness of the impact on local providers in other sectors.
- Whether commissioners require a tried-and-tested approach or innovation.
Characteristics
- Users should be able to revisit the tool, and amend answers to see how different decisions affect the viability of PbR.
- Users should be able to complete as much of the tool at one sitting as they wish, saving their work and returning to it at a later date.
- Clear and usable – no jargon!
- Additional guidance and best-practice resources should be generated when users download their answers.
How long?
There was no consensus on this issue, but participants felt that the tool needed to be flexible and to generate different levels of learning, so that users could explore each issue to the extent that it was relevant to them.
Examples of tools to learn from
- Impact assessment from B Corp
- Measuring Up: impact assessment from Inspiring Impact
- Social Finance questionnaire on social impact funding
- Institute for Government’s public service markets diagnostic tool
Next steps
In early February, Russell will analyse the learning from these workshops alongside the feedback from the online PbR surveys (of frontline staff and service users) and begin developing the first version of the tool that month.
He hopes to have an initial version of the tool ready for piloting by March 2016.