How can research truly inform practice? It takes a lot more than just providing information

Jonathan Sharples

The Education Endowment Foundation’s latest evaluation report, the ‘Literacy Octopus’, provides plenty of food for thought for anyone interested in improving the way research evidence informs practice, not just in education, but across sectors.

This pair of large, multi-armed trials evaluated different ways of engaging schools with a range of evidence-based resources and events. The common focus was on supporting literacy teaching and learning in primary schools.

The findings make it clear that our notion of ‘research use’ needs to extend beyond just communicating evidence – for example, publishing a report online – to looking at how it is effectively transformed and applied to practice. This message is particularly sobering, given that basic communication strategies still make up the majority of organisations’ efforts to mobilise research evidence, despite those organisations being aware of the limitations. This applies to all sectors, not just education.

The first trial tested whether simply sending schools evidence-based resources in a range of formats could have an impact on literacy outcomes – this included printed research summaries, practice guides, webinars and an online database. The second trial tested whether combining these resources with additional support to engage with them would have greater impact.

In total, more than 13,000 schools were involved in these two ‘Literacy Octopus’ trials. Some schools were just sent evidence-based resources, while others received the resources along with additional light-touch support, such as invitations to twilight seminars. By testing different ways of engaging schools with the same evidence, the intention was to compare ‘passive’ and ‘active’ forms of research dissemination.

In what are some of the largest randomised controlled trials (RCTs) ever conducted in education, the evaluators, the National Foundation for Educational Research, found that none of the approaches had an impact on pupil attainment, nor on the likelihood of teachers using research to inform their practice.

The findings of the ‘dissemination’ trial, where schools were simply sent evidence-based resources, are perhaps not surprising. There has been a growing recognition over the last 20 years that simply ‘packaging and posting’ research is unlikely, by itself, to have a significant impact on decision-making and behaviours. These findings add further weight to that understanding, this time in the form of much-needed empirical evidence.

So why didn’t the second ‘Literacy Octopus’ trial, which tested the impact of providing schools with some additional support to engage with the evidence-based resources, have more success?

A recent systematic review, published by my colleagues at the IOE’s EPPI-Centre, sheds some useful light on what might be going on. This review looked at six mechanisms that underpin a range of ‘knowledge mobilisation’ interventions (for example, creating access to research and developing skills to use research) and how they affect decision-making. Importantly, they also looked at the behaviours that were necessary for those approaches to have an impact. This included having:

1. Opportunities to engage with the interventions,
2. The motivations to do so, and
3. The skills and capabilities to understand and use the outputs.

Crucially, across all the different types of research-use interventions, they found that impact on decision-making relied on also attending to these behavioural needs. For example, interventions that create access to research only appear to impact on decision-making if they are also combined with strategies that create opportunities and motivation for engaging with that evidence. Likewise, interventions that focus on building people’s skills to use evidence, for example through training and professional development, are conditional on people also having the opportunity and motivation to act on what they learn.

Furthermore, it is often the use of multiple strategies – as opposed to single ones – that influences decision-making, particularly where these approaches are embedded in existing structures and processes (e.g. school improvement or policy systems).

In light of these insights, the interventions in what we termed the ‘active’ arms of the Literacy Octopus actually appear quite light-touch. To what extent, for example, can attending a conference create the opportunities, motivations and skills needed to do something with the evidence being presented? What further support, capacity, and conditions are needed for that evidence to gain traction on classroom and school improvement?

A range of evaluations funded by the Education Endowment Foundation over the last few years illustrate a similar trend: that just exposing teachers to information about evidence-based practices is rarely sufficient in itself to improve teaching and learning, even if that information is underpinned by robust research.

If we look at projects that do show promise, they often provide careful scaffolds and supports to help apply conceptual understandings to practical classroom behaviours and specific subject domains. Indeed, schools in the ‘Literacy Octopus’ trials that did change their practice in response to the evidence often appeared to do so through structured in-school collaboration and enquiry.

We take away three key lessons:

  1. Traditional communication and dissemination of research should be seen as just one strand of a multi-faceted approach to mobilising knowledge. It should be seen as a foundation for further activities, rather than a route to research use in itself.
  2. Projects and interventions that encourage engagement with research need to provide better support for translation and adoption back in the school. For example, a growing body of evidence demonstrates the benefits of in-school coaching and mentoring in supporting changes in classroom behaviours – we should explore how these and other activities can be woven into projects that support research use in schools.
  3. We should continue to help build the general capacity and skills in the sector to use research as part of school improvement. This includes developing resources and processes to support evidence-informed school improvement, as well as working with regional and national policy makers.
For more information, videos and supporting resources, please visit: https://educationendowmentfoundation.org.uk/

Jonathan Sharples is senior researcher at the EEF and professorial research associate at the IOE’s EPPI-Centre
Comments
  1. geraldine461 says:

    My own practice as an Educational Psychologist has been influenced by a paper by Marzano et al that I read over twenty years ago. I have made their model my own: that in order to adopt a new practice, the individual needs to first identify that the practice it is to replace is bankrupt or redundant. This step – of consigning redundant practices to the ‘trash’ – leaves a vacuum that makes adoption of an alternative approach more likely. Definitely one of my ‘desert island’ papers.

    Marzano, R. J., Zaffron, S., Zraik, L., Robbins, S. L., & Yoon, L. (1995). A New Paradigm for Educational Change. Education, 116(2), 162–173.

  2. educationstate says:

    By coming clean about its limitations?

    Education research, especially the ‘What Works’ kind, exaggerates its powers. A good example of this is the claim that research “shows” something. That type of word inflates what research actually does, which is to “suggest”. Education research is always probabilistic, after all.

    Until education research is more open and public about its underlying weaknesses, it will likely continue to be seen as a politically and institutionally self-interested imposition, rather than as something that can help practice.

    Teachers are not delivery agents, ultimately, they are intellectuals. Education research seems to overlook this.



UCL Institute of Education

This blog is written by academics at the UCL Institute of Education.
