Evidence in the commissioning process: Insights from focus groups with local authority commissioners (2019)

Published: 22/05/2019

Author: Godar R

Citation:

Godar R. (2019). Evidence in the commissioning process: Insights from focus groups with local authority commissioners (2019). Dartington: Research in Practice.

1. Executive Summary

This report is the result of two telephone focus groups with commissioners of children’s services in English local authorities. Participants came from a range of backgrounds and were responsible for commissioning a wide range of services.

Commissioning in context

Commissioners are working in a context of austerity and increasing financial pressure on commissioning budgets. This influences the use of evidence in a number of ways:

  • Evidence of effectiveness and cost-benefit can help to justify spending scarce resources
  • However, many interventions with strong evidence come at a high cost, and commissioners are looking to adapt interventions to fit the available resource
  • The drive for innovation and finding new solutions to save money or improve services is in tension with using interventions with strong evidence, but a broader view of evidence can inform logic models and theories of change, preventing innovation from being 'a blind leap of faith'.

Types of evidence

Participants highlighted different types of evidence used at different points of the commissioning process:

  • Evidence of local need, derived from national and local data, surveys and qualitative research with young people and families, and feedback from practitioners
  • Evidence about 'what works', or what might work here, derived from research produced by academics, voluntary organisations and government, and informal intelligence from other local authorities
  • Evidence from the market about what is possible locally, given the skills and capacity of organisations, including the local authority itself, to provide the desired service.

Commissioners identified a tension between listening to children and young people locally and using evidence to inform service design. This tension was partly resolved by talking to young people about the outcomes that they wanted to achieve, and then looking for research evidence to find an appropriate way of achieving those outcomes. This helped commissioners to sift through evidence to find studies that provided evidence for those specific outcomes.

What makes evidence trustworthy, robust and useful?

In deciding what weight to give particular studies or findings, commissioners considered:

  • Who produces, publishes and reviews the evidence
  • The context in which the study was done, with studies in English local authorities with similar demographics being preferred
  • The methods used, with mixed-methods studies providing both hard data and user voice preferred to single-method studies. Participants varied in their level of confidence that they could judge the methodological merits or pitfalls of a particular study.

Commissioners are not only looking for services or interventions with strong evidence; they are also looking for something that 'fits' with the local context and systems. Studies that describe how an intervention is delivered and implemented allow commissioners to make more informed choices about local fit.

When making choices, commissioners considered:

  • What do we already have here? How will those services be affected by any new service?
  • What will the demand for this new service be? What capacity does the service need to meet that demand?
  • What capability do we have locally to deliver the service? Are there skills gaps that might be a barrier to recruiting the required professionals?

Accessing research

Commissioners both sought out evidence for particular purposes and tried to keep abreast of new research by signing up to information services and research repositories. When looking for evidence, participants reported making use of organisations that synthesise and quality assure evidence to ensure that the evidence they used was robust. This also helped to mitigate some of the difficulty of accessing paywalled academic journals. Some authorities were trying to generate local evidence of effectiveness by evaluating programmes already in place. Generating local evidence of a similar quality to formal academic evidence was felt to be challenging for a number of reasons:

  • Agreeing outcome measures and tools across services to get comparable results requires substantial negotiation and quality assurance
  • Evaluating services operating in a ‘real-world’ environment needs to be able to cope with adaptations that occur in response to local events or changes in the wider system
  • Evidencing counterfactuals for individual children and young people is difficult – i.e. being able to say this child would have come into care without this service.

As well as formal academic research and local evidence, commissioners also drew on the experiences of other local authorities, through informal networking – this was felt to have benefits of being able to get more detailed feedback, and in particular, learning what hadn’t worked as well as what had.

The format of research influenced whether commissioners found the time to 'dig into the detail':

  • Executive summaries with key messages were preferred, in order to be able to make a quick judgement about the relevance of the paper as a whole.
  • Charts and diagrams helped commissioners to quickly understand evidence of impact.
  • There was some interest in alternative formats for research, such as podcasts and videos, but there were often barriers in accessing these formats at work due to technical limitations.

Recommendations for commissioning teams and their managers

  • A mix of skills in reading and critically appraising research is important to make the most of the available evidence. Developing these skills should be part of the professional development of commissioners. This can be done through integrated commissioning teams, a corporate intelligence and insights service or cross-authority collaboration.
  • Access to research is a crucial tool for commissioning teams. While there are some free and trusted sources, sometimes this requires a budgetary commitment to get access to current and relevant research.
  • Getting out to conferences where research is presented, or meeting with commissioners in other authorities, provides opportunities for professional development as well as increased awareness of the available evidence.
  • The organisational culture around use of evidence will shape how individual commissioners work. Giving commissioners a clear steer about the extent to which new services should be ‘evidence-based’ will help them to prioritise the most useful research. Are you looking to innovate based on theory, or adapt an existing programme to fit your local system? How should commissioners balance evidence-based services with co-design and user voice?
  • Consider the different services you commission and the culture of using evidence within those service areas. Do you use the right language and consider the right research to best influence these decision-makers?

2. Introduction

This report is the result of two telephone focus groups held with local authority commissioners of children’s services. The 18 participants worked in local authorities of a range of sizes, locations and levels of deprivation and came from a variety of backgrounds, including health and academia as well as operational and management roles within children’s social care. Commissioners were responsible for, or involved in, the commissioning of a range of services for children:

  • Across different levels of need, from universal play and youth services to services for looked after children and care leavers
  • Across different types of need, including family support, domestic abuse, mental health, specialist health services for children with complex needs and education, employment and training (EET) support
  • Across different agencies, with some participants working in integrated teams supporting commissioning in public health and social care, others across children's and adults' social care, and in one case all of the above.

This range of experience provided rich perspectives and learning about what evidence is used in commissioning and how it is used in practice in local authorities in England.

3. Commissioning in context

Austerity

Unsurprisingly, participants felt that their role and the way that they used evidence had been affected by austerity, though how individual authorities were responding varied.

One participant reported that her mandate was to be transformational, looking at new ways of commissioning and new ways of meeting need to reduce system-wide costs, while another reported rarely having to consider the evidence base for new services, because many providers had been in place for 'decades' and there was little appetite for change. These existing relationships were seen as valuable in negotiating service reductions as budgets fell.

For some, the need to deliver value for money required a rigorous focus on ‘what works’, with robust evidence providing reassurance that limited resources would have the desired impact. But for others, evidence of effectiveness produced in less austere times was of limited relevance to their current situation – ‘it’s like something from a different world’ – and this meant that programmes and interventions needed adapting to fit the available resources, or more innovative (and therefore less well-evidenced) services needed to be explored.

Innovation and evidence

Throughout the focus groups, participants highlighted the tension between innovation and evidence-based services. One participant highlighted the need for ‘defensible decision-making’ when trying to cope with cuts and felt that opting for services with evidence of effectiveness was an important (though not the only) element in demonstrating that the consequences of budget reductions had been thought through. Over-reliance on services with an evidence base, however, was seen by others as ‘shrinking our world’ to a limited number of options. Allowing space for innovative service development was seen by some as a crucial part of the response to reducing budgets, and to meeting the needs of individual young people and families.

From my experience, some of the smaller organisations have some out of the box ideas about how to work with families and some of our most challenging young people and we do need to test that out

Innovation did not mean ignoring evidence of what works, but using it in a different way. While some participants described innovation as ‘a leap of faith’, others argued that by developing a strong logic model, with evidence for each step of the model, commissioners could have a reasonable amount of confidence that an innovation would work.

Engaging stakeholders

Commissioners are not making these decisions in isolation, and a number of participants described their role as being to 'persuade' or to 'justify' their recommendations to senior management and, crucially, to politicians, who have the final say on budgets and services. Again, the extent of political involvement in commissioning decisions varied, in part due to the local culture of being 'member-led' and in part due to whether the proposals represented a significant variation from existing patterns of provision. Some commissioners reported political direction at the beginning of the process, for example a direction to consider digital and self-serve solutions wherever appropriate, or to build the capacity of the local voluntary sector and generate broader social value. For others, members were involved throughout, or at least this was felt to be the most successful strategy for member engagement. The consequence of this is that the commissioning process, from evidencing need through options appraisal to formal procurement, can take two or more years, 'by which time, the world has changed'.

4. Types of evidence

Participants highlighted different types of evidence used at different points of the commissioning process:

  • Evidence of local need, derived from national and local data, surveys and qualitative research with young people and families, and feedback from practitioners
  • Evidence about 'what works', or what might work here, derived from research produced by academics, voluntary organisations and government, and informal intelligence from other local authorities
  • Evidence from the market about what is possible locally, given the skills and capacity of organisations, including the local authority itself, to provide the desired service.

Evidence of local need

Evidence of local need provides the basis for identifying gaps in, or improvements to, current service provision. Understanding local need is important for understanding the potential demand for, and therefore capacity of, any proposed service. The commissioner’s role is to “make sure people have thought through” why they want a particular service and what they think the benefits will be and to whom.

Commissioners gather evidence of need from a range of national data sources which provide statistics about the prevalence of particular needs and the use of services locally. Examples provided included:

  • understanding projected population increases and how that might affect demand for services
  • comparing looked after children numbers and populations with similar authorities
  • using Hospital Episode Statistics to understand use of acute services.

This use of data is not a straightforward process, but requires a good understanding of the local context of comparator authorities, and of where the data come from, in order to make robust estimates of need. Even then, in some cases the data are insufficient to estimate demand reliably:

  • Demand for new preventative services is particularly hard to estimate. One commissioner talked of being overwhelmed by demand for a preventative mental health service because of the hidden nature of previously unmet need.
  • Demand can be variable within the authority, particularly in large county authorities. How need is distributed within the authority can be difficult to identify from national data. Authorities relied on local intelligence and data from other local agencies and services to understand patterns of need, and how service provision should reflect that.

A dominant theme of both focus groups was the importance of the voice of children and young people in identifying gaps in, and improvements to, services. Some participating authorities were putting significant resources into finding out what kinds of support children and young people felt they needed and were not getting: in one authority, a survey designed and delivered by young people had received over 4,000 responses and was being used by the local authority, schools and local voluntary services to evidence need and support bids for funding.

Participants identified a tension between ‘what young people want’ and ‘what we know works’, though many felt that this could be resolved by asking young people about the outcomes that they want to achieve, rather than the way in which a service might improve those outcomes.

My experience is that they don’t want something outrageous, they just want decent services

Identifying and describing the desired outcomes was seen as an important step before looking for evidence of what works to achieve those outcomes. This informed the way that commissioners looked for and interpreted evidence. For example, one participant reflected on the study of the effectiveness of the Family Nurse Partnership, which found that FNP is not effective in improving antenatal smoking, birth weight, A&E use and second pregnancies. However, these were not the most important outcomes of the programme from the commissioner's perspective; instead, they were interested in the effect on parental confidence and self-esteem, and on social care referral and care entry. The study had not, therefore, had much influence on their thinking about whether to continue commissioning the service.

Using a very good approach to evaluating evidence doesn’t matter if you are measuring the wrong thing.

Evidence about ‘what works’ (or what might work here)

Having identified local need and defined the outcomes that they wanted to achieve, commissioners use research evidence to understand what works, or at least, what might work in their context, to inform their appraisal of options for meeting those needs. Commissioners were acutely aware of the importance of context when translating evidence of effectiveness from one place to another, or from an academic trial into real world practice. A difference in context could reduce the potential positive effects of an intervention and needed to be carefully considered, particularly when arguments relied on a return on investment. When considering whether a proposed service would ‘fit’ with the local context, participants considered both the context in which the evidence was produced and their own local context – looking for similarities and differences between the two.

Evidence from England was considered more likely to fit than evidence generated abroad, though international evidence was being used to inform approaches to family support in at least one area. Where evidence had been produced in English local authorities, it was important to understand that authority’s journey and their starting point, to understand the relevance to the local situation. To do this, commissioners considered:

  • Ofsted judgements
  • historical trends of demand
  • types of service provision, for example the pattern of placements between external and in-house providers.

Translating international evidence required even further investigation into the wider system in which the intervention had been shown to be effective, and rigorous questioning of whether the evidence would apply in the UK. Participants described this approach as 'being a bit braver'.

Evidence about what is possible here

Understanding the local context and system is the other half of the ‘fit’ calculation.

We know what we’ve got and what we’ve not got and that drives you from the start

Any new service would have to complement, not duplicate or 'cannibalise', existing services. Commissioners were aware of the destabilising effect that commissioning a service could have on other teams, services and agencies, and sought to avoid those effects through early engagement and discussion; for some, new integrated commissioning teams actively pursued shared priorities and pooled budgets.

The capacity of the local market was an important consideration when planning a new service. Commissioners assessed whether there was in-house or external capacity to deliver a service to the desired specification, and considered the wider local recruitment market where services depended on particular skills.

  • Services that require qualified psychologists were cited as an example where recruitment difficulties are well known, leading to caution in commissioning services where these qualifications are required to be faithful to the model.

  • Some evidence-based interventions were felt to be too difficult or specialist for local providers to deliver, and where there is a sole national provider, commissioners reported having to make special dispensations to procure from a specific provider, rather than through a competitive tender.

Commissioners were actively working with providers and in-house services locally to drive the creation of innovative services, informed by evidence. In some cases, this was making slight adaptations to something that had worked elsewhere to fit the local system, but in other cases commissioners reported drawing on the theory or logic model to design a local service from scratch.

We were looking at commissioning respite overnight as a tool for developing independence skills, so you can see at the core of that there will be research evidence around why that is a good thing and what kind of interventions are likely to be successful, and then we are using that to develop a service…. facilitating contact between service providers, the voluntary sector or alongside profit making organisations

Commissioners felt that a focus on outcomes had the potential to prompt a different relationship with providers and potential providers. By specifying outcomes, providers could be given more flexibility in how services are provided and outcomes are met, either at the pre-procurement stage when the service specification is being shaped, or with the development phase built into the contract, where the successful bidder works alongside the authority to develop a new service. This requires a significant amount of trust and ongoing communication. It also requires building in flexibility, particularly where a service is new, so that the provider can adapt their approach as local needs become clearer. Some commissioners felt that an expectation on the use of evidence could be built into these outcome-focused agreements, leaving it to providers to access and develop the evidence base.

5. What makes research evidence trustworthy, robust and useful?

Evidence is useful when it is verifiable, peer reviewed studies for example. That doesn’t mean double-blind randomised trials because that’s unrealistic, but decent, trustworthy research sources and research agents

How commissioners accessed and interpreted evidence varied somewhat, and this appeared to be a reflection of their own background and knowledge; the types of team that they worked in; and which service areas they covered. It was noted that different service areas have different cultures of using evidence, and use different language to describe it. Commissioners needed to reflect the traditions and priorities of the organisations that were paying the bills, with one commissioner in an integrated health and social care commissioning team noting:

If I am commissioning for the CCG or public health, they will have much, much higher expectations of the academic quality of the research included in a literature review. But that is in part because there is much more academic evidence available relating to those services and because of the way that they have evolved.

Education services were seen to be less keen on randomised controlled trials and less comfortable with the idea of commissioning based on evidence, and it was felt that, particularly around support for children with special educational needs, there was a lack of useful research to draw on.

Trust

Where commissioners worked in teams with the necessary knowledge and skills, literature reviews were found to be a helpful way of summarising key messages about effective practice from multiple studies. This process might identify and recommend particular interventions, but was more likely to identify elements of effective practice that should be incorporated into any redesign of services. For others, the process of finding evidence was more haphazard – "Google is the first port of call". Repositories of evidence, particularly when they included a critical appraisal of the evidence base, were highly valued. Organisations such as Research in Practice, the Early Intervention Foundation and NICE were trusted sources of evidence synthesis and key messages.

Commissioners described the pros and cons of different producers of evidence and how this influenced what evidence they drew on in decision-making.

  • Academic research from universities was seen as independent, robust and reliable, but it was often inaccessible due to paywalls, with local authorities unable to afford access.
  • Research produced by the voluntary sector provoked more reflection about the purpose and the objectivity of the findings. Commissioners were aware that many national voluntary organisations have a lobbying function, as well as research, and some felt that the blurred line between the two raised questions about the validity of the research. Participants felt that some kind of external validation of research produced by voluntary sector bodies that are also providers of services would give more reassurance of objectivity.

Robustness

Participants reported different levels of confidence and understanding of research methods and how to interpret them, with some having very little knowledge and relying on third-party analysis, while others were deeply involved in the critical appraisal of primary research as part of their role.

For those confident in handling data, quantitative studies that provided models for projecting benefits or savings were felt to be helpful, providing commissioners with tools to monitor the programme in the real world and compare it to the original study findings. Transparency about the tools and methods used in the study itself was felt to be important, allowing commissioners to judge for themselves how much weight to put on the findings – for example, stating how outcomes were measured, whether by professionals, by self-report or with validated tools.

It was noted that it is more difficult to judge what weight to put on qualitative studies reporting on service user feedback on receipt of services.

It could just be one or two children, you really don’t know how representative it is.

Nonetheless, the human stories and case studies included in research reports were described as more 'persuasive' than a report containing data alone, particularly when commissioners were trying to 'make the service come alive' for decision-makers. Mixed-methods studies that combine quantitative data and qualitative feedback were thought to be the best sources of evidence.

Utility

It was important for commissioners that research evidence provided them with relevant and useful information for implementation, including:

  • the capacity of the service and whether it would scale up or down to fit the size of the commissioning authority
  • the qualifications of the professionals delivering support, “obviously the cost of having highly trained professionals is huge – it could be four times as much depending on the qualifications.”
  • the costs of delivery, and how these costs are incurred – this was important where commissioners were looking to deliver a similar service at reduced cost.

How are the services set up and are they the same as ours, and what is … the resource required to deliver that model and then compare it to our financial envelope and then see if the model can be tweaked to fit our money and our system.

6. Accessing research

Sources

As noted above, commissioners seek out evidence to answer specific questions that inform decision-making, looking to trusted repositories to find the evidence they need. A number of participants also received regular email alerts or bulletins from a range of organisations, including the Association of Directors of Children's Services (ADCS) and the NSPCC. These were found to be useful because they arrive regularly and unprompted, spark new ideas and can be circulated to colleagues.

Participants noted the value of face-to-face discussions with either the academics who produced the research or the local authority where the research was conducted. This was felt to allow deeper exploration of the methodology and a more nuanced report of findings. In some cases, commissioners found that the local authority where the research was done was more cautious about impact than the research report implied, or that results had not been sustained after the initial pilot period.

Some commissioners were attempting to gather evidence of effectiveness locally, collecting outcome measures from existing services to understand impact. One authority was evaluating a new approach to commissioning, distributing grants through a philanthropic organisation rather than commissioning specific services, and had engaged a university as a research partner to investigate the impact on capacity and sustainability in the voluntary sector.

Generating local evidence of a similar quality to formal academic evidence was felt to be challenging for a number of reasons:

  • Agreeing outcome measures and tools across services to get comparable results requires substantial negotiation and quality assurance
  • Evaluating services operating in a ‘real-world’ environment needs to be able to cope with adaptations that occur in response to local events or changes in the wider system
  • Evidencing counterfactuals for individual children and young people is difficult – i.e. being able to say this child would have come into care without this service.

Beginning with an intervention with robust evidence was felt to counteract the risk that local evidence would be insufficiently robust to demonstrate impact.

[We] start from the assumption that the intervention we are delivering is a robust intervention rather than trying to demonstrate every time that we have delivered positive outcomes. …. We would prefer an evidence based programme, it is evaluated, it is independently verified, rather than saying we have the evidence that it works here from delivery with 300 children, which would have less weight.

Formal research evidence wasn’t the only source of ideas and learning. Regional benchmarking groups provided another source of evidence – other authorities that had shown a reduction in looked after children numbers, for example, could be approached for ideas about what had worked and why, and whether it might work elsewhere.

The evidence isn't tested, but there is something to be said for knowing what other authorities are doing and have done with providers and services, and taking from that some of their learning … particularly in the local context, because for many of us we are working with the same providers across artificial boundaries.

It was noted that these informal discussions also provided information about what didn’t work, as well as what did, which can be equally valuable intelligence for commissioners, but rarely appears in formal research.

Formats

In terms of report formats, the executive summary was seen as crucial for determining whether the research was relevant and 'worth my time to dig into the detail'. Key information that commissioners wanted to see in the executive summary related to costs, impact and information about implementation – 'how do I put this into practice?'. Reports written specifically for commissioners, such as those produced by the Independent Children's Home Association, were reported as being in a particularly useful format, with costs and impact stated up-front. One participant suggested that the executive summary should serve as an index to the full report, allowing the reader to easily navigate to the relevant sections for more detail as needed.

Respondents preferred reports that are visually appealing, with a number stating a preference for infographics and charts alongside text. A tension was noted, however, between the use of pictures and the cost and feasibility of printing. Many participants sheepishly admitted to preferring to read printed copies, highlighter in hand. For one participant, reading research papers was an opportunity to get away from the screen for a while.

A couple of participants expressed a view on more innovative formats, such as podcasts and TED talks. Some were attracted to the idea, which would allow them to access research at home or during a long commute, though two participants noted the IT barriers to accessing audio or video online while at work.

7. Recommendations

Recommendations for producers and communicators of research

  • Commissioners find mixed-methods studies the most useful, giving them both robust data on outcomes and costs, and case studies containing user voice that 'bring the service to life'.
  • Many commissioners are identifying the outcomes that they want to achieve first, then looking for research to help them achieve those outcomes. Due to a lack of time, commissioners rarely read research speculatively. Clearly identifying the outcomes achieved, and how those outcomes were measured, in the executive summary and other publicity is likely to attract more readers.
  • Context matters to commissioners. While authorities involved in research might prefer to remain anonymous, important details about the demographics, needs, service structures and performance of the local authority need to be included.
  • Commissioners value ‘fit’ over ‘fidelity’. In the face of shrinking budgets, many commissioners are reluctant to adopt evidence-based programmes in their entirety. Commissioners are looking to adapt successful interventions or draw on their theory of change to design and deliver their own localised versions.
  • Commissioners are busy people. They use trusted sources of research first, and are unlikely to get beyond the summary if it does not attract their interest.
    • A clear, concise executive summary tailored to their interests is most likely to get their attention. This executive summary should contain information about context, costs and impact.
    • Making use of third-party research repositories to store and validate research might increase reach and impact.
    • Email bulletins are an effective way of raising awareness and are circulated beyond the original recipients.
    • Conferences and word of mouth recommendations are important sources of information – promoting research through these face-to-face channels might increase reach.
  • Commissioners are not the only people who need persuading; in fact, many see their job as persuading others to 'follow the evidence'. Giving commissioners the tools that they need to persuade budget holders and decision-makers would be helpful. Different stakeholders will be persuaded by different things – the Clinical Commissioning Group might want data on the impact on acute services, while elected members might respond better to personal stories.

Recommendations for commissioning teams and their managers

  • A mix of skills in reading and critically appraising research is important to make the most of the available evidence. Developing these skills should be part of the professional development of commissioners. This can be done through integrated commissioning teams, a corporate intelligence and insights service or cross-authority collaboration.
  • Access to research is a crucial tool for commissioning teams. While there are some free and trusted sources, sometimes this requires a budgetary commitment to get access to current and relevant research.
  • Getting out to conferences where research is presented, or meeting with commissioners in other authorities, provides opportunities for professional development as well as increased awareness of the available evidence.
  • The organisational culture around use of evidence will shape how individual commissioners work. Giving commissioners a clear steer about the extent to which new services should be ‘evidence-based’ will help them to prioritise the most useful research. Are you looking to innovate based on theory, or adapt an existing programme to fit your local system? How should commissioners balance evidence-based services with co-design and user voice?
  • Consider the different services you commission and the culture of using evidence within those service areas. Do you use the right language and consider the right research to best influence these decision-makers?


Professional Standards

PQS:KSS - Support effective decision-making | Quality assurance and improvement

PCF - Contexts and organisations | Professional leadership