Research changes lives. Somewhere along the way, every research project involves a question about making a difference to the way people think, behave or interact. In applied social science – the field in which the IOE excels – research sets out to understand, shape and change the world around us.
The idea of research ‘impact’ has become important in research management and funding. Universities are now held to account for the impact of their research, not least through the Research Excellence Framework. But ideas about ‘impact’ and how to secure it vary.
In popular accounts of science, research is sometimes described as ‘earth shattering’, as if it creates something like a meteorite crater reshaping the landscape for ever. An example might be Frederick Sanger’s development of rapid DNA sequencing in the 1970s, which has transformed practices across any number of fields.
But there are other images to describe impact. Not all research has what the LSE research team looking at impact call an ‘auditable’ consequence. They comment that research applied in practice is “always routinized and simplified in use” so that over time, the impact fades like ripples in a pond.
The University of the Arts talks of its research as having creative outcomes that enhance cultural life and provides examples of artworks which create possibilities and shift perceptions: ideas which float into the air like dandelion seeds.
The impact of some research is apparent quickly – though almost never as rapidly as the tabloid newspapers which almost weekly trumpet miracle breakthroughs would have us believe – whereas in other cases it can take decades before the value of research becomes apparent.
Not only does the IOE itself undertake research which seeks to have an impact, it’s also interested in understanding what impact looks like, what it means and how it happens. At a recent conference we explored the linked questions of research impact and public engagement; the relationships between research, policy, practice and improvement are things some of my colleagues try to understand.
The ESRC defines research impact as “the demonstrable contribution that excellent research makes to society and the economy”. This suggests three components: the demonstrable nature of the contribution, the quality of the research, and the focus on both society and the economy. Successful impact means holding all three in creative relationship: without any one of them, the other two are diminished. Research which is not excellent will not make a demonstrable contribution; research which sits unpublished and unread will not, whatever its methodological sophistication, make a demonstrable contribution; and so on.
Understandings of impact – or of knowledge mobilisation and its dynamics – have been transformed over the last fifteen years, as the barriers to, and strategies for, making creative use of research to affect people’s lives more directly have become clearer, and as ways of engaging the public in the dynamics of research have developed. No research – however excellent – ever simply has an ‘impact’.
Richard Doll established in the 1950s that smoking causes lung cancer, but it took several years and active public health campaigns to change behaviour. In education, the gap between, say, research on assessment for learning (AfL) and AfL practice suggests that – like the idea of the droplet making ripples on a pond – the impact of research can quickly dissipate unless something active is done.
Research always needs mediating – or, to put it differently, research impact needs a plan. Academics used to talk about ‘dissemination’, but thinking has moved far beyond such models – “I research, you listen” – to more creative and nuanced understanding of the ways knowledge moves – and does not move – around organisations and society. We have learnt that while these relationships are complex, they can be managed effectively.
In the early days of work on research impact, thinking focused on ‘what works’, on the assumption that research could tell us which techniques are most effective, and that this could in some way be taken to scale by more or less sophisticated dissemination techniques. We have become – individually, institutionally, collectively – more sophisticated than that, and we have done so quickly. We know that ‘how it works’ and ‘why it works’ are just as important, and that the effort to link the worlds of research, policy and practice involves commitment and engagement from all parties. In Ontario, the Knowledge Network for Applied Education Research tries to draw key groups together to enhance knowledge mobilisation. Bringing about change in practices is never easy, as anyone who has ever tried to lose weight, get fitter or learn a new language knows.
There’s a nice social enterprise quotation: “success does not matter. Impact does”. The IOE is a socially engaged university. We care about the quality of the research we undertake, and we work to ensure it is of the highest standard. But we care equally about the way our research shapes the society it is seeking to understand. We understand that research evidence will always be only one of the factors that influence society, and that other factors always intervene. But we also know that progress has been made in the past in this field and more can be made in future with persistent effort.
For us, ‘impact’ is not an artefact of the 2014 REF, nor an obligatory hoop through which to jump. There is a wonderful line in Carnegie Mellon University’s 2008 strategic plan – and very, very few university strategic plans contain quotable lines – but Carnegie Mellon got it right: “we measure excellence by the impact we have on making the world better”.