As we wind down from a relatively calm examination season – even with the introduction of new examinations this year – some of us continue to mull over the idea of ‘standards’ in examination systems.
What does the term ‘standards’ mean, anyway? It crops up everywhere in the world of assessment. In England, the exam boards that offer general qualifications like GCSEs and A levels, together with Ofqual, the regulator, have to try to make sure that grades have the same meaning across subjects, in different years, and even between competing exam boards – a Sisyphean task fraught with technical challenges. This is an area in which assessment researchers like us can see our work having real impact, and there are plenty of exciting developments to shape new thinking. One of those developments is the publication on 10 September by the UCL IOE Press of a new book called Exam standards: how measures and meanings differ around the world.
Standard setting in national exams is a topic of interest throughout the global assessment community, yet opportunities for information sharing are rare, given the politically sensitive nature of the subject. Recall the claim, made by a previous Minister of Education in England, that if a single student received the wrong mark on his or her GCSE, then something was fundamentally wrong with the system.
One of us joined that system in 1994, having come from the US, where standard setting is a mysterious process owned by each state (and sometimes each school district); the other joined in 2014, having spent most of her working life in Scotland, which has a separate qualifications system with standard-setting policies and approaches that contrast with England’s. It became clear to us, and to colleagues from Oxford University and Ofqual, that the opportunity to learn from other systems could benefit practitioners and academics alike. Consequently, we established a major collaborative project – Setting and maintaining standards in national examinations – that aims to open conversations between international experts. Together, we are exploring how different jurisdictions tackle standard setting for their respective national, school-leaving or university entrance exams.
The project, led by a partnership of people representing some of the key assessment organisations in England, critically explores the policy, procedures and politics (in the form of seemingly inevitable controversies) of setting exam standards internationally, drawing on the work of government, exam board and academic colleagues from around the world.
We asked experts from developing and developed countries to document how standards are defined and enacted in their education systems. We tried to gather as many different ways of setting standards as we could, but not to rank order them – this wasn’t an exercise in creating a league table of how good national exams are. It did become clear from our experiences that England’s GCSE and A level system is distinctive, and as good as any. Each system, naturally, has its own issues, and the experts captured the variety of challenges they face and responses to these within their own political and economic systems. The countries involved are: Australia (Queensland and Victoria), Chile, England, France, Georgia, Hong Kong, Ireland, South Korea, Sweden, the US (Advanced Placement Examinations™), and South Africa.
In an age of globalisation, policy borrowing and benchmarking, where governments and assessment bodies around the world look to each other to question or validate their own practice, it’s helpful to gain a deeper understanding of what examination standards mean in different political, social and economic contexts. For example, countries such as Chile, Georgia and South Africa have had to grapple with the legacies of unequal educational opportunities for their children. Other jurisdictions, such as Sweden and Queensland, Australia, are trying to strike the right balance between teacher-led assessment and external testing.
This development of a knowledge community has been a critical outcome of the project, one that we have found to be enormously rewarding and are eager to share with others. We’ve begun to open the black box of international standard setting and would very much like to see the box opened wider, working with our global colleagues.
The UCL IOE Press book, edited by Jo-Anne Baird, Dennis Opposs and this blog’s authors, is one of the more tangible of the project’s outcomes. It includes a groundbreaking look at assessment paradigms; guidelines for conducting insider research; an examination of what standard setting is; nine case studies from jurisdictions in the developed and developing world; a new theoretical conceptualisation using an ecological model; and a section on culture, context and controversy in standard setting. Its launch takes place during the autumn conference season, including at the British Educational Research Association (BERA), the International Association for Educational Assessment (IAEA) and the Association for Educational Assessment-Europe.
Next, we hope to work with this newfound knowledge community to produce a Special Issue of the journal Assessment in Education: Principles, Policy & Practice. Throughout this work, it has become clear that despite differences between our systems, most governments and their exam boards face similar pressures and challenges. Senior school examinations shape students’ future life chances, and the more deeply we collaborate, the more we appreciate how vital it is to share our knowledge on how we set and maintain standards in those examinations. It is a privilege to be part of this important project.
Dr Tina Isaacs is Honorary Associate Professor, UCL Institute of Education and Dr Lena Gray is Director of Research, Centre for Education Research and Practice (CERP), AQA.