What is evidence-informed practice?

Published: 07/08/2023

Dez Holmes, Director of Research in Practice, explains what evidence-informed practice is, why it’s of particular relevance to the social care sector and what it means for practice and leadership.

Talking Points 

This podcast looks at: 

  • What Research in Practice mean when we talk about evidence-informed practice. 
  • How evidence-informed practice is different to evidence-based practice.
  • Why the construct of evidence-informed practice matters in our sector.
  • Research and policy related to evidence-informed practice.
  • What the concept of evidence-informed practice means for the way that we think about lived experience.

[Introduction]

This is a Research in Practice podcast. Supporting evidence-informed practice with children and families, young people and adults.

Dyfrig: Welcome to the Research in Practice podcast. I'm Dyfrig Williams, Head of Learning at Research in Practice. In this podcast, we'll be looking at evidence-informed practice. Today I'll be talking to Dez Holmes, our Director at Research in Practice. Dez, can you introduce yourself, please?

Dez: Hello, my name is Dez Holmes. I am the Director of Research in Practice, as you say. A real pleasure to be talking to you today.

Dyfrig: In today's podcast, we'll be looking at what we mean when we talk about evidence-informed practice, why it is of particular relevance to the social care sector and what it means for practice and leadership.

[What do Research in Practice mean when we talk about evidence-informed practice?]

Dyfrig: So Dez, can you tell me what does Research in Practice mean when we talk about evidence-informed practice?

Dez: We’ve used the term evidence-informed practice for most of our, what are we now, twenty-eight-year history in fact. And it's become much more common in practice in recent years. What we're getting at, at its most simplistic, is triangulating multiple sources of knowledge and evidence. So of course using robust, relevant, up-to-date research and data, and combining this with the knowledge held by professionals, sometimes called practice wisdom or tacit knowledge, the things we know from the work we do. And thirdly, crucially for us, we see the expertise born out of people's specific lived experiences as another form of evidence. So for many of the questions that we're trying to help our partners and other people we work with answer, we're encouraging them, trying to support them, to use these multiple sources of knowledge. It's a bit like plaiting bread, you know. You're bringing together different sources. I guess we would argue it's a tapestry, not necessarily a hierarchy, so we don't hold one of those things (lived experience, practice wisdom or research) automatically above the others in a hierarchy. It's about fitness for purpose per question that we're trying to answer.

So we've been really influenced over the years by academics in this space. People I want to give a shout out to are Sandra Nutley, Huw Davies and Vicky Ward. There's some really quite sophisticated thinking there that has taken us beyond rolling out research evidence into this notion of evidence-informed practice. So, the approach we take really complements the work of, for example, the What Works Centres. It gives us a little space to attempt the “What if? What matters?” questions, as well as those “What works?” questions. Recognising that some of the questions facing our sector, those “What if? What matters?” questions, can be very hard to answer through experimental studies alone.

[How is evidence-informed practice different to evidence-based practice?]

Dyfrig: So to delve in a little deeper, can you tell me how evidence-informed practice is different to evidence-based practice?

Dez: Yes, it's a great question, and it's important I say at the start that the term evidence-based practice has many, many merits. So the fact that we use evidence-informed practice is not a criticism of evidence-based practice per se, but there are some differences. The term evidence-based tends to be understood as implementing quite discrete interventions or programmes that have been very rigorously tested, and tested in a very precise way, in order to try and replicate the same results when you implement what they found in the testing. Which sounds absolutely grand, and is for a number of discrete interventions or programmes. The term originally came from medicine and medical research, and so for good reason it relied on a, sort of, ranking of methods. A hierarchy of methods for testing. And some commentators over the years have argued that that approach, that, sort of, ranking in order of priority the best ways of testing effectiveness, can mean that standardised interventions, discrete interventions with a beginning and an end and a very clear recipe behind them, are often much better served by that kind of approach to experimentation or testing. And unfortunately, that can sometimes mean that some qualitative methods become delegitimised, so things like user-led research or participative action research, or our ability to harness community-based knowledge. Now, all of those things, of course, are very important when we think about the kind of what if, what matters, and what works for whom questions that our sector focuses on.

One of the other critiques of that more traditional approach to evidence-based practice, or evidence-based medicine for example, is that it sometimes characterises what's called knowledge transfer as if it's a one-way process, you know. You kind of imagine that the clever stuff gets created over here somewhere, and then gets communicated or transmitted into practice. And that can position professionals or policymakers as if they're passive recipients of research knowledge. So that for us doesn't quite work and we take a different approach, and I can talk a bit more about that later. And the other critique that is sometimes applied to evidence-based practice in the way that I describe is that not only does it sometimes struggle to attend to approaches in the way we work that aren't prescribed or standardised interventions, and not only does it preclude valuable sources of knowledge, and not only does it suggest that knowledge only travels one way, but it can also suggest that there is a definitive answer or one correct approach, and that isn't always the case, of course. Particularly when we think about really diverse needs, really diverse contexts, you know. What works in Wigan on a particular issue might not work the same way in Weymouth, or indeed Wandsworth. So we really want to think about local context and diverse communities, and recognise that, however difficult this is to confront, there might be no simple, singular answer or correct approach.

[Why does the construct of evidence-informed practice matter in our sector?]

Dyfrig: Taking that then, thinking about complexity, why does the construct of evidence-informed practice matter in our sector? Why is it important?

Dez: So, if I think about one part of the sector that Research in Practice is particularly involved in supporting - social care, social work, family support, early help, that kind of wider body of work - it's worth acknowledging that, you know, adherence to traditional standards of evidence can be very, very helpful. I really want to stress that point. If we think about literacy programmes, or particular interventions for supporting parents experiencing perinatal mental health difficulties, all that kind of stuff. You know, there is absolutely a vital place for those evidence-based intervention programmes. But in the context of our work there are a few things that we also have to recognise. There is a lack of randomised controlled trials, or experimental trials of other types, which would often be considered the best quality, the best type of approach to testing an intervention. And of course, we can't simply sit here and wait for a magic money tree where millions more will be spent on RCTs (Randomised Controlled Trials) or other studies. And it's fair to say those kinds of methods might not be appropriate to some of the practice challenges facing the social care or wider support sector. Though, there was an interesting debate just a few years ago about trying to establish a randomised controlled trial in relation to family group conferencing, and some colleagues in the sector felt very, very strongly that that should be an entitlement, and that there were really significant ethical issues about having a control group who weren't given that offer. So, there is a conundrum there for colleagues trying to create best evidence.

Others in this field, researchers in this field, have highlighted the potential for what's called therapeutic nihilism. So we don't have any trials, we can't prove what works, therefore maybe nothing works. Well, with the sector under as much pressure as it is, I think avoiding therapeutic nihilism feels pretty important for all of our resilience. Again, as I've touched on, some colleagues highlight the de-prioritising of some qualitative methods that can sometimes happen under evidence-based practice constructs. That can mean that the very methods that are best suited to understanding, let's say, relationship-based practice actually become de-prioritised or devalued. So there's a disconnect there between the method and the content that is being explored. For us, one of the really important factors here is that sometimes that sort of trials-led creation of evidence, and the prioritising of that, can mean that user-led research, minoritised scholars, for example, and others are excluded. And that can mean the way we generate evidence can actually reinforce the inequity, the marginalisation, the discrimination that we're all here to try and disrupt and address. So that's a big part of our thinking and our values.

There's something, I think for us, that we've really learnt since 1996, when Research in Practice first started working. And I think one of the really enduring messages that we have found is that taking this evidence-informed approach, drawing on multiple sources, can really support that ability to ask better questions rather than seek the perfect answer. It can allow us to be a bit more pondering in our approach, curious. It can create that sense of dialogue rather than direction, and that can be a really good fit for practice, given, you know, that a lot of the work we do is trying to help colleagues who work on highly emergent issues. Criminal exploitation would be an example here. The evidence base in terms of research is nowhere near complete, so it's highly emergent.

A lot of the work that we help colleagues do is in complex systems, strength-based approaches across the whole of how we work with adults in a multi-agency way. The role of professional judgement, particularly in social work, is a really, really key issue here. So we try to always think about how we enable evidence literacy, not just research compliance where we're trying to model that sense of engagement in the process. It's about curiosity and critical thinking, and analysis and using evidence from multiple sources in combination with your values. Not simply always just following the manual. Although again, I'll just stress that where we have good trials in place and well-evidenced interventions, it is often very, very important to follow the manual.

[Research and policy]

Dyfrig: So, is there any research or policy that will be of interest to people in the sector?

Dez: Absolutely. I mean, I'm really, really pleased to say that, you know, it's certainly not just Research in Practice who talk about this approach to generating and implementing evidence. It's become increasingly common to talk about, for example, lived experience as a source of knowledge. I might touch on lived experience in a moment, if that's alright Dyfrig. You can see, for example, shout out to colleagues at the University of Birmingham, Jon Glasby and the team who are running the IMPACT centre, which is generating all sorts of new knowledge around how we use evidence in adult social care. The recent children's social care review that the DfE [Department for Education] commissioned had that very clear approach of having an experts-by-experience group, a research evidence group, and then a group of colleagues who worked in practice. So, there is this growing sense of multiple sources of knowledge being really, really important. A really interesting piece of research recently, one that I read not long ago and that really struck a chord with me, was from Allison Metz and Todd Jensen and others. They really highlight that even when we're implementing very evidence-based interventions, which you might imagine to be, you know, follow-the-manual type approaches, what they found, and what they argue, is that it's the relational strategies that matter, not just the technical strategies. They really highlight the importance of developing very trusting relationships across all stakeholders, you know. That sense of “This is a social dynamic”, a relational endeavour; it's not simply a technical “We're going to roll out an evidence-based programme” kind of approach. That links to the work of Vicky Ward and others who I've also been really impressed by. Vicky in particular has done some lovely work looking at the qualities and skills of knowledge mobilisers, and we often think about our Research in Practice Link Officers as being knowledge mobilisers. They are, you know, drawing ideas and information and challenges in from their colleagues in practice, and then, sort of, digesting these and trying to help them use the evidence-informed resources that we've produced.

This sense of, it's who you are and how you behave, it's not just what you know. I find that really compelling. It makes me think, particularly when I think about knowledge mobilisation, of the literature around boundary spanning. There's a cracking paper, it's about 20 years old now, by Williams. I think it's called something like “The Competent Boundary Spanner”, and I use it all the time, because it talks about how in leadership roles, or indeed in knowledge utilisation roles, we're always trying to span boundaries between practice and research, between practice and leadership, between qualitative and quantitative data, you know. Deliberately trying to squirm our way into the spaces between the silos, and that to me feels really, really relevant for the sector that we work within. And it speaks as well, I think, to the idea I touched on earlier of one-way knowledge transfer, that sometimes evidence-based practice can suggest to people a one-way transfer. We at Research in Practice prefer the language of knowledge mobilisation. For me, certainly, it's about respect. There's a researcher, Lawrence Green, who wrote a paper a few years ago that talks about the fallacy of the pipeline. That fallacy of the pipeline, that fallacy of the empty vessel, relies on the idea that practitioners or policymakers or leaders are just sitting there waiting to be filled up with somebody else's academic knowledge, and that's not our experience at all. And again, if we think about the practice values of the sector we support, we wouldn't want any citizen to feel like a passive recipient of professional intervention; we want people to be active agents in the work that they're doing with a given professional, you know. Think about social work, youth work, occupational therapy: our practice in our sector at its best is about partnership and collaboration with the person being supported.

[Lived experience]

Dyfrig: So, going back to the point that you made around lived experience, what does evidence-informed practice mean for the way that we think about lived experience? Do we need to think critically about lived experience?

Dez: I think we do, yes. And I've… I suppose I've changed my mind a bit about this over the years. You know, if you were doing this podcast 5-10 years ago, I'd probably have ranted for 45 minutes about how people don't respect lived experience as a valid source of knowledge. More recently, I think I take a more nuanced perspective. It is, of course, great that we're seeing an increased commitment to respecting and valuing people's lived experience in, for example, policy formulation and research generation activity. But I think we do need to think a little critically here. I think back to my point about it being a tapestry, not a hierarchy, and what we see across our work at Research in Practice is that… the stuff we know because we've lived it is absolutely vital, but it is not definitive, any more than the stuff we know because we've worked it or because we've studied it is definitive. Very few, if any, sources give a definitive answer. They help you ask better questions when you take them collectively and consider them as a whole.

There's also an issue with the language, isn't there? Lived experience of what? I found myself in a group not long ago where someone said we need to make sure we've got young people here who bring lived experience, and I was like, “Of what? Being young?” Because I'm kind of hoping we all have that experience; none of us were born at the age of 42, were we? There's often a lack of specificity. Do we mean lived experience of being supported by domestic abuse services? Lived experience of having a physical disability and being worked with by a social worker, you know? That specificity really matters. You wouldn't just invite any researcher to talk on any topic just because they're a researcher, you know. So it's that same sense of discipline in the thinking.

The other thing, equally, I know it sounds obvious: my lived experience is just that - it is only my lived experience. I might choose or try to talk on behalf of all overweight, brunette Virgos who grew up in Wales, but I should not be allowed to, Dyfrig. Because my lived experience is only my lived experience. One is a very small sample size to base any practice or policy decisions on. There's something about whether we are able to go “sources for courses”. Depending on the question in hand, we need to go to the right source. I would not dream of asking even the most skilled and knowledgeable researcher who spent years analysing the quantitative data on care leaver outcomes to tell me what it's like to leave care. In the same vein, I wouldn't ask a person who's had their child removed into care what works for all parents who've faced that kind of trauma. So a big part of respecting that sort of knowledge is asking the right questions of it.

Part of what has sharpened my thinking is some really excellent commentary in recent years that's been such a helpful provocation to the accepted orthodoxy of “Lived experience matters.” Well, of course it does, but then what? So, there was a blog recently, we must put this in the show notes, authored by a guy called John Radoux, who himself brings lived experience of growing up in the children's care system and now works as a professional in a therapeutic role. He writes very, very well, and he wrote a blog which stopped me in my tracks. The title alone is a great provocation. He talks about the fetishisation of lived experience and he makes all sorts of thought-provoking points, one of which was along the lines of… and forgive me John, I'm probably going to misquote you here… “If you're building a hospital, it might be important to talk to patients, but really you need to engage an architect and ideally get that architect to talk to patients. Not privilege someone's experience of being a patient over someone who understands where to put load-bearing walls and what kind of double glazing we need.” That again sounds really obvious, but I sometimes think we're not as disciplined as we could be in thinking about inadvertently privileging some knowledge or expertise over others.

There are two other points I'd make. The first is that we must not assume that research knowledge, practice wisdom and lived experience exist only in separate people, or that they're mutually exclusive. Many, many people go into practice because of their personal experiences. Many people go into research because of their practice, and indeed their personal experiences. So again, it's like plaiting bread. These sources of knowledge are not mutually exclusive. We have no idea, when we sit in a meeting of other professionals and we talk about, let's say, mental health, and we say “We need a mental health service user in, to tell us what it feels like” - well, if one in four of us experiences mental ill health in our lifetime, how on earth do we think that no one in that room of professionals brings lived experience? So it's just about being a little cautious that we don't assume mutual exclusivity.

The last point, and there are colleagues up and down the country who do really good thinking on this and offer a lot of challenge, is that some of our methods and approaches to accessing people's expertise through their lived experiences of particular issues can be tokenistic. They can even be downright exploitative, if we're going to be really tough on ourselves. Again, I'm influenced here by a great blog I read a year or two ago during national care leavers' week, where a young person who had experience of being in care was talking about how traumatising and triggering that week is: “I hate this week because you suddenly get loads of people asking you to speak at conferences, sometimes for free, sometimes for ten minutes, sometimes where they won't pay your travel.” That feels not only tokenistic, but also quite harmful. Quite exploitative in that harnessing of their very private stories in ways that are not always ethical. And I think in some of our work, this extractive approach… “Come and tell me your most personal, private, painful story, Dyfrig. Tell it to a room full of two hundred people. In fact, do you know what? Why don't we film it? Why don't we record it?” Now, even if you gave your freely given, fully informed consent at the time, what if you feel differently next year? It's already on YouTube. What if you feel differently in 10 years' time when you're training to be an occupational therapist yourself? I would go so far as to say that in some sectors, particularly when we're thinking about things like knife crime and so-called youth violence or serious violence, we want to pay real attention, I think, to the potential colonial aspects of this work. It can be a really extractive industry if we're not careful.

Certainly, I will agree with most of the country I'm sure, that people's lived experiences are a really important source of knowledge that we must respect and honour and validate and draw on to make complex decisions. But it's a tapestry not a hierarchy, and we have to do it well, and that might involve us thinking quite critically in how we do so.

[What does it mean to be evidence-informed in practice and what does it mean for leadership?]

Dyfrig: What does it mean to be evidence-informed in practice and what does it mean for leadership too?

Dez: That's a good point, isn't it? Even the language of evidence-informed practice is pretty flawed, because people think it only means direct practice. But I think we would say we're also talking about management practice, leadership practice, commissioning practice, policy-making practice; it's those multiple levels. So if we think about a person in direct practice working with an adult or a child or young person, it would mean, of course, drawing on and considering those multiple sources. What do we know from the best available, most up-to-date, relevant, robust research? What do I know as a professional working in this space, and what can I learn from other professionals - my manager, my supervisor, my peers, my multi-agency partnership colleagues? And what can I learn from the person I'm serving and/or other people I've served and supported? How do I draw on these multiple sources of knowledge to inform my decision making and my approach? Always paying attention to context, and finding a way of holding both things in mind: the evidence base and your own professional values. Why do you do the job you do? What matters to you in terms of your values, your morals, your professional code of ethics, if you work in a role that has one? For me, I guess, it's about curiosity. Always wondering: “What if? What matters? What might work? How would I know if it wasn't working? What evidence is there to suggest that this might be a good path to pursue with this particular family? With this particular older person I'm trying to help live an independent life? How do I know that? How legit are the sources I'm drawing on? How would I know if I were making a mistake? How can I ensure that I feel safe enough to say if I'm making a mistake? Who do I go to if I need to course correct?” That sense of curiosity and critical thinking seems really paramount here.

There's always a nice example here, actually: one of the really popular bits of work that we've done, supported by the excellent Professor Danielle Turney, who is one of the original authors. Years ago, we did a project that was about analysis and critical thinking in assessment. The group of professionals we worked with produced a set of Anchor Principles about understanding why we're telling this story, what the story is, what would need to change, and how we know things are changing. Really quite simple, but quite profound statements. And what's been so interesting to see over the last decade or so is how that framework for analysis and critical thinking - “What evidence source am I drawing on? How do I know they're the right ones? How do I know they're robust?” - has been equally applicable from working with very young children, through to working with teenagers, through to working with adults. There's something about the commonality across the sector of how we've absorbed this way of thinking about knowledge and evidence into our day job. I would go so far as to argue that those Anchor Principles, and again, we'll put a link in the show notes to the analysis and critical thinking handbook, are exactly the same if you're writing a whole local area exploitation strategy or undertaking a strategic needs assessment. So some of the principles around “Am I going to the right sources? What are the questions I'm trying to answer? How do I check my questions are the right ones to ask? How do I stay curious? How do I kick the tyres and check that the knowledge I'm using is up to scratch? How do I change my mind if I need to, and how do I demonstrate my professional values and ethics within the work I'm doing?” are actually transferable from direct practice right through to strategic leadership.

There is of course some organisational infrastructure that really helps here. Again, I'm thinking of Nutley and Davies, who talked about that organisational excellence approach, where the approach to evidence is just woven into the brickwork and bloodstream of an organisation, which is what we're all aiming for. There are some really practical things that matter here. Staff having access to, ideally, really high-quality, synthesised research-based resources. Hello, I wouldn't be doing my job if I didn't mention that that's what Research in Practice provides. Is there a library or an intranet you can go to? Does the supervision policy for staff have embedded within it questions around the use of evidence, so that supervisors are sharing that responsibility and driving that culture of evidence-informed practice?

There's something here about the intellectual investment. Are leaders showing their workings out? Asking their questions in public, providing some of the intellectual leadership necessary to be curious, evidence-informed professionals in the local system? Then there's also the emotional investment - showing that it matters. Showing that being evidence-informed in our work is not some niche, nerdy, abstract academic thing. It's the absolute bread and butter of our moral purpose. The very least that the citizens we serve deserve is that the people supporting them know what they're up to, and professionals working in the tough context that we are in now deserve to have a really clear framework of organisational support for practising in this way. I would say it's a whole-organisation responsibility; it's not just about individual practitioners or leaders. And of course, all of this means being really open to new ideas, being prepared to be challenged, being prepared to be surprised, being prepared to be wrong. I guess that's part of the emotional resilience required. There's a lovely blog that Michael Sanders, the ex-Chief Exec of the What Works Centre, has written for us actually, about how being evidence-informed means being open to surprises. Sometimes unwelcome surprises, where a thing you were really rooting for is found not to be effective. We'll make sure we share that too in the notes.

[Research in Practice resources]

Dyfrig: You've already mentioned the analysis and critical thinking handbook. What other resources do Research in Practice have in this topic area and how can people use these in practice?

Dez: I'll give a meta answer here: most of our resources, whether you come to a conference or a webinar, or whether you're reading one of our publications, and whether they're for people in practice or people in leadership roles, all try to embody that evidence-informed practice triad. They all seek to include, obviously, robust, relevant, legit research, but also often case studies or practice examples drawn from the sector. And very, very often, wherever ethically possible, they include expertise born out of people's lived experience. We try to model this in all that we do. A couple of really good examples of that - we've got a whole suite of multi-media resources around supporting people affected by recurrent care proceedings in a trauma-informed way. Videos, blogs, publications and tools which draw directly on the experience of people who have had recurrent care proceedings for their children, as well as diverse professional knowledge across the voluntary and statutory sector. And of course, some stellar research from brilliant colleagues like Claire Mason at Lancaster. Another example is, and I'm really excited about this, it's not published yet, although I've now just dated the podcast, so apologies. It might be published by the time you hear this… an evidence review that we have been writing in partnership with Social Care Future, where we really wanted to push ourselves to write with and alongside, to co-author an evidence review with, people who currently draw on adult social care services. Rather than, for example, writing it between practitioners and researchers and then inviting people who use services to contribute a small part.

I don't think it would be surprising for me to say it's much harder work. It's really intensive work. But the learning has been phenomenal. And in terms of living our values, wearing our values on our sleeve, it's a really exciting bit of work. We've also got a tonne of practical resources that can help organisations and individuals wanting to be more evidence-informed. There's an evidence-informed practice organisational audit tool. Now, don't be put off by the word audit; it's a reflective tool where organisations can explore and assess themselves in relation to the practical, intellectual and sometimes emotional aspects needed to create that infrastructure for evidence-informed practice. We've got tools for how to undertake group supervision, and for how to create the conditions in which you can talk about evidence in a natural, informal way in your team. What are the firm foundations to build within teams and services for evidence-informed practice? And right through to things like podcasts like this, and webinars we've done around engaging with evidence and, I guess, making it part of the day job. Like all of our work, we're trying to create connected learning pathways: stuff you can read, stuff you can watch, stuff you can listen to. Really thinking about how we provide multiple opportunities. So let's take a really tricky issue vexing many colleagues across the sector right now: the issue of exploitation. We've just recently produced national, cross-government principles to guide multi-agency work at a strategic level. We also then created a practice tool that individual practitioners can use to reflect on their own practice in the context of exploitation. This complements things like podcasts that we have with victim-survivors, and indeed siblings of people directly affected by exploitation. We've made sure that we've got a briefing for those who work with adults around exploitation and mental health, to complement the resources for those who work with children. I guess that's the point of Research in Practice. We're trying to help good people, doing sometimes quite tough work in very complex contexts, to be evidence-informed in their practice. We're here to shoulder some of the burden, producing really high-quality resources. Helping people to think about multiple sources of knowledge from different perspectives, to navigate what's often quite complex territory. Because very, very rarely in our world is the answer as simple as “There's a research study on that, just implement the recommendations.” That's not usual in individual practice or in complex systems approaches.

So I guess for us, when we first started out, evidence-informed practice was quite a niche area. And even in my time, and I didn't join Research in Practice until about 2009 I think, we were still having a debate about whether evidence-informed practice was important. I think it speaks volumes about the talent of our sector that we no longer have those discussions. It's not whether, it's how. How can we demonstrate evidence-informed practice in our work? I think that's a testament to the sector really. I wholly reject any notion that the sector is evidence-resistant, anti-intellectual, or needs to be spoon-fed. In my experience, professionals at every level are hungry for knowledge, are capable of co-creating new knowledge, and are curious about how best to apply that knowledge. So it feels like we're in a very privileged position to be helping them with that.

Dyfrig: So all the resources that you mentioned, we'll make sure they're on our website. It was great to talk to you and delve further into what we mean when we talk about evidence-informed practice.

[Outro]

Thanks for listening to this Research in Practice podcast. We hope you've enjoyed it. Why not share it with your colleagues, and let us know your thoughts on Twitter? Tweet us @ResearchIP.

Reflective questions 

Here are reflective questions to stimulate conversation and support practice.

  1. What would an evidence-informed approach mean for your role?
  2. How can you ensure that the way that you bring evidence from lived experience into your work is not exploitative? 
  3. How can you effectively draw practice wisdom into your practice? 
  4. How might you bring learning from academic research into your practice? 

Professional Standards

PQS:KSS - Child and family assessment | Analysis, decision-making, planning and review | Organisational context | Promote and govern excellent practice | Shaping and influencing the practice system | Confident analysis and decision-making | Purposeful and effective social work | Lead and govern excellent practice | Creating a context for excellent practice | Designing a system to support effective practice | Developing excellent practitioners | Support effective decision-making | Person-centred practice | Effective assessments and outcome based support planning | Direct work with individuals and families | Supervision, critical analysis and reflection | Organisational context | Professional ethics and leadership | Influencing and governing practice excellence within the organisation and community | Promoting and supporting critical analysis and decision-making

CQC - Effective | Well-led | Responsive

PCF - Professionalism | Critical reflection and analysis | Contexts and organisations | Professional leadership

RCOT - Develop intervention