I was recently complaining bitterly to friends about the refusal of governments to consult any research evidence before embarking on huge innovations which will disrupt the lives of teachers, parents and children, without the slightest idea of whether they will actually work. I was surprised when a highly intelligent, well-informed friend simply shrugged off the idea of evaluations in education, particularly comparative studies: “Too complicated, can’t be done…”
That brought me up short. I admit that much educational research leaves me a bit queasy; I used to squirm when speakers confidently proclaimed that ‘on average’ people with Level 2 numeracy earn £5,000 a year more than those with lower qualifications. While this may well be ‘true’, there is no causal link; if you pass GCSE maths no-one is going to come along and give you a pay rise – though as a result you may, in time, achieve higher pay than you would otherwise have done (again, that is impossible to prove). Untangling the complexities of lives is a constant problem for all involved in social science research.
Comparison studies are complex in a different way. In the research into the impact of curricula in nine ‘high achieving’ jurisdictions, led by Dr Tina Isaacs and funded by the National Center on Education and the Economy (NCEE), our first problem was with the phrase ‘high achieving’: simply using PISA as a definition would not have been our choice, but as these jurisdictions were given to us as part of our starting criteria, and they clearly were (at least) fairly high achieving, we went along with the premise. The study was a paper-based exercise, and initially we worried about the lack of field work to triangulate the policy documents and academic papers. However, when I spoke with colleagues who specialise in working in other cultures and countries it became clear that, in the end, you go where the government allows you to go, talk to the people who are set up for you, and so receive the impression the authorities want you to receive. True independent research in another country or culture would probably require years of living in that society before you could navigate it on your own.
In the impact of curricula study we were largely led by government policy, but we did attempt to get alternative perspectives. For instance, the educational approach in Japan is significantly less focused on high-stakes testing than in most East Asian countries, and has been moving towards a policy of yutori, or ‘relaxed learning’. But government policy takes little account of the cultural tendency towards tutoring and out-of-school learning: some estimates suggest that up to 70% of Japanese lower secondary students attend out-of-school juku, or cram schools.
In terms of the wider picture, Tina and I did not expect to find a great deal of evidence of the curriculum having a direct bearing on achievement, and that is exactly what we found. In my experience, social scientists are generally very unhappy about negative results. Perhaps because I was initially educated as a physicist, I fully appreciate the importance of negatives; so the fact that a student in lower secondary school receives just over 4,000 hours of education in Australia but only 2,500 hours in Finland, without any discernible impact on achievement, seems to me quite relevant. Similarly, a student in Year 4 receives 220 hours of maths teaching in Singapore but only 130 hours in Japan, again without any apparent impact on PISA results; this is, to me, a respectable finding. It suggests that when governments perch in their seats of power and micromanage the minutes each school spends on a currently fashionable subject, it is, as we may well have assumed, likely to be a waste of time.
Where we did find correlations they were perhaps not dramatic, but they were important. We found that all the high performing jurisdictions promote 21st century skills, have national curriculum guidelines that allow for local interpretation while holding standards constant, and all but one offer a comprehensive core curriculum for all students through lower secondary school. Overall, though, instructional system patterns varied across the jurisdictions.
The new special issue of the Curriculum Journal, which Tina and I edited, shows that our study was really a starting point for others. The issue was divided into articles looking more closely at subject areas (maths, science, language and social studies) and more in-depth analyses of individual jurisdictions and how they compared with our findings (Ontario, Japan, Queensland and England). The variety and richness of these articles shows that, for all the theoretical difficulties in conducting comparative studies, the field is robust enough to allow researchers to uncover interesting and important messages which can, for all their cultural baggage, be usefully incorporated into domestic systems.
No research is without its drawbacks and caveats; but social science research can give us useful and important ideas and understandings, and so too can comparative studies. Just don’t ask us to define what a ‘high achieving’ system is.