To avoid a repeat of last summer’s exam chaos, the government must decide quickly on alternative assessment measures. There is a strong case for A-level students to sit externally set tests – with flexible timing and content – to take account of differences in their learning experiences.
While the uncertainties of a global pandemic make this one of the most volatile periods of education policy in history, if there is one lesson we should all have learned since last March, it is that indecision is costly. This has proved true repeatedly for public health and looks just as relevant for education.
As we saw with last summer’s exam fiasco, the failure to act decisively left little alternative but to assign school students’ grades based on teachers’ predictions of what they would have achieved. This sub-optimal situation removed any final contribution on the part of the students and, more importantly, resulted in significant biases across school type and family background.
Of course, back in the summer of 2020, the government had little time for the advance planning that any alternatives would have required. But with GCSE and A-level exams now cancelled, the government must act quickly to provide clarity on both whether and exactly how students are going to be assessed this year, so that we don’t have a repeat performance in the summer of 2021.
Should exams have been cancelled?
Certainly, the decision to cancel exams in their usual format was the right one. For exams to give all school students the same chance to succeed, one of the pre-requisites is that they have had the same amount of time to prepare. But we know that is not the case from looking at the patterns of disruption to their studies. While both of the exam cohorts (year 11 and year 13) missed up to five months of schooling in the academic year 2019/20, the disruption has continued during this crucial exam year and in a much less uniform manner.
Unfortunately, England does not publish data on attendance rates by year group, but we can look more broadly at attendance rates in all state-funded schools by region over the autumn term. Figure 1 illustrates that while attendance rates started the academic year at between 85% and 95%, by mid-November, we were seeing rates substantially below this (falling from 88% to 83% on average) driven by widespread – but regionally varying – self-isolation by both individual students and education ‘bubbles’.
In mid-November, attendance rates were lowest in the North West and Yorkshire. By mid-December, with what we now understand to be the prevalence of the new variant increasing, London, the East and the South East had all seen stark declines in their attendance rates. In contrast, the South West has remained near the top of attendance rates throughout.
Figure 1: Weekly attendance in state-funded schools by region, 10 September 2020 to 10 December 2020
Source: GOV.UK
Nor is remote learning likely to level the playing field. In a previous Economics Observatory post from November, we highlighted several pieces of evidence from the first lockdown (for example, Andrew et al, 2020 and Anders et al, 2020) that children in better-off families spent more time on home learning and were much more likely to have benefitted from online classes than those from poorer backgrounds.
Much of this reflects differences in resources for home learning, such as laptops and fast internet connections. While schools and homes may be better set up for remote learning this time around, many of these inequalities are entrenched and likely to remain.
One of the most immediate concerns with the lack of clarity about this year is that current year 11 and year 13 students might switch off, thinking that they no longer have to take exams. Indeed, some teachers report that this type of disengagement may be as much a risk to students’ learning as the more widely discussed issues of unequal access to remote learning devices and the internet (National Foundation for Educational Research, 2020).
While the prime minister’s announcement of the new lockdown was very clear that exams were cancelled, what was far less clear was what any replacement assessment might look like.
So what now?
The government have promised that they will ‘work together with Ofqual to put in place alternative arrangements’ for assessment this summer. So, what should these alternative arrangements look like?
A-levels, BTECs and GCSEs
While discussions in the media suggest that the approach this year might rely on teacher assessment, the evidence clearly shows that we should actively avoid this approach to assigning grades for A-level and GCSE students. Such an approach will call into question both the rigour and fairness of the awarded grades.
We argue that externally set and marked tests remain the most rigorous and fairest option for all students taking terminal exams. But these do not need to take place in the typical format, during a three-week period in June.
Instead, there is a strong case for more flexible timing for testing students, spreading exams across the summer term and, crucially, allowing students to sit these exams at different times to deal with any continuing need for closures during this period. While this would involve more work for exam boards, given the need to provide multiple versions of each exam, it is the fairest way to ensure that students do not miss out on external assessments. The fact that it requires more work only underlines the need for swift action.
Further, we should ensure that such exams include flexible content. This would help to reduce the unfairness caused by the fact that different schools will have been able to cover different content through interruptions to in-person schooling. These reformed exams would be more like university finals: students could be given a wider set of options and asked to answer a smaller proportion of them – for example, two questions from six alternatives covering a wide sweep of the curriculum.
This approach would have substantial similarities with what has already been announced in Wales, also supporting fairness for university applications between applicants from the two nations. It would ensure that students can still be awarded grades that they have earned, while providing robust information on achievement for universities and future employers.
In contrast, exams in Scotland have been cancelled altogether – with the National 5s (the equivalent of GCSEs) cancelled several months ago and Highers (Scotland’s A-level equivalent) cancelled just before Christmas. Scotland will instead base awards on teachers’ judgement. But while announcing this well in advance gives schools more time for continuous assessment, this is not a system that should be replicated in England.
The evidence is clear here: the use of teacher assessment puts additional pressure on already-strained teachers, and will lead to inflated grades and inequalities. This devalues any awarded qualifications. It will be costly for those who go on to university, who may not be treated in the same way in the admissions process. But it will be even more costly for those for whom it will be their final qualification, one that employers are likely to view with scepticism for the foreseeable future.
Under current proposals, young people studying for BTECs and other vocational and technical qualifications will continue to take their January exams from this week. This raises some obvious questions about how it will be possible when the country is back in lockdown, and how fair it is to expect them to perform as usual in highly unusual circumstances.
It also means that it may not be possible to compare BTEC results with their A-level equivalents, since students will have been assessed in very different formats. Given that BTEC qualifications are used alongside A-levels in university applications, this will have serious implications for fairness in applications.
Whatever route is taken with assigning grades this year, a serious issue that cannot be ignored is that this cohort will know less than previous cohorts, which will have important implications for future productivity across the board:
- For those going on to further education, universities and colleges will have to adjust their courses to account for significant learning losses. This will take time and resources.
- For those taking vocational and technical qualifications, the challenge is even starker: many students will require further practical training before they are at the standard required to apply their trade.
Primary school assessment
While key stage 1 tests have been suspended for 2021, current plans are for key stage 2 tests to go ahead, although school-level results will not be published. Given that key stage 2 tests are primarily used as indicators of school performance – which will be measured with substantial error this year – there are serious questions about their value to bodies such as Ofsted, the education regulator, with which they are still proposed to be shared for accountability purposes.
There is a strong case for abandoning key stage 2 assessment of any kind given the current circumstances. This would significantly reduce the burden on primary school teachers, who are working under very difficult conditions, and would remove associated stress on children and parents.
University admissions
The cancellation of this summer’s exams obviously has serious implications for all young people, including those planning on entering university this coming autumn.
As discussed in a previous Economics Observatory post, students in the UK apply to courses before they sit their A-level exams and equivalents, using predicted grades. This year’s university application deadline for 2021 entry is 15 January (although for some courses and institutions it was earlier), which means that the exam cancellation could affect the application decisions of a subset of school students. But since the government have said that they will ‘put in place alternative arrangements that will allow students to progress fairly’, it is not clear that young people would benefit from second-guessing their choices.
This only serves to underline the flawed nature of our current university applications system. Requiring young people to apply to courses before they sit their exams and learn their grades is deeply problematic in any year, but even more so under current circumstances, where they do not even know how they will be assessed.
What we do know for sure is that young people from each home nation – all treated equally as UK students for the purpose of university applications – will be assessed differently. This may also put even more reliance on predicted grades, which have been shown to be inaccurate.
While it is likely to be too late for young people entering university later this year, it is increasingly clear that we should move to a system of post-qualification admissions for next year. Again, the government needs to act swiftly to put in place such changes before the start of the next admissions cycle in September.
The need for swift action
The longer it takes for the government to make a clear decision on how school students will be assessed this year, the harder any alternative arrangements will be to implement, until the point at which they are no longer feasible. At that point, there is a major risk of a repeat of last year’s exam fiasco – but without the excuse of not having had time to prepare a better alternative.
Where can I find out more?
- Minority report: the impact of predicted grades on university admissions of disadvantaged groups: Richard Murphy and Gill Wyness examine A-level prediction accuracy by student and school type.
- Test scores, subjective assessment, and stereotyping of ethnic minorities: Simon Burgess and Ellen Greaves examine differences in teacher assessment compared to externally marked tests.
- Post-qualification application: a student-centred model for higher education admissions in England, Northern Ireland and Wales: Graeme Atherton (NEON) and Angela Nartey (UCU) present a model for post-qualification applications.
- Impact of school closures on the attainment gap: rapid evidence assessment: Education Endowment Foundation report.
Who are experts on this question?
- Gill Wyness, Centre for Education Policy and Equalising Opportunities, UCL, London
- Lindsey Macmillan, Centre for Education Policy and Equalising Opportunities, UCL, London
- Jake Anders, Centre for Education Policy and Equalising Opportunities, UCL, London
- Simon Burgess, University of Bristol, Bristol