Has A-level grading been a disaster?
Yes, for a significant minority of pupils. The Ofqual algorithm did not use centre assessment grades for most pupils; it used the school's historic grades. Individual pupils who are at the bottom of their school's rank order, but are in fact better than their predecessors, have suffered.
On the other hand, the majority of pupils were given grades which were the same as or better than their centre assessment grades (CAGs). That is a far higher match rate between grades and predictions than in a normal year. And grade inflation has been constrained.
Another point is that everyone thinks it is terrible that the results ‘have been fiddled’ by a computer algorithm, not appreciating that this happens every year with the grading of exams. The days when an examiner marked a script and that mark translated directly into a grade are long gone. Because of the comparable outcomes approach to grading (each year in each specific subject the same proportion get each grade), computers fix the grades. What is more, Ofqual’s own research shows that in essay-based subjects like history and English a high proportion of candidates in a normal year get a grade different from the one they might have been given had they had a different marker. Two experienced markers can give the same essays very different marks.
So this year some students got the wrong results, as they do every year. The crucial thing is to have a reliable appeals system.
Matthew Syed makes another good point: ‘The fundamental problem is not grade inflation but the cultural malaise from which it springs — the manic but self-defeating desire, particularly among middle-class parents, that youngsters pass everything.’ (Sunday Times, 16 August)
Will the situation with GCSEs be worse than A-levels?
- Small subject cohort sizes are rare, so almost all GCSEs will be graded without reference to CAGs.
- There are many more exams – nearly 5 million.
- The 3/4 grade boundary in English and maths is a life-changing distinction for hundreds of thousands of students.
Why have 40% of A-level CAGs been downgraded?
Because teachers over-predicted relative to both the normal national distribution of grades by subject and to their own school’s normal results. To prevent grade inflation, and to prevent some very optimistic schools doing unfairly well, grades had to be lowered.
Why have disadvantaged pupils suffered most from downgrading?
Because they have weaker GCSE results and this fact (‘prior attainment’) was in the algorithm. Also, disadvantaged students are more likely to be in schools with a weaker historic record of results.
Have independent schools done well out of the system?
Yes and no.
Yes, the proportion getting A/A* rose more than in other schools because independent schools have smaller cohort sizes by subject. For subject cohorts of more than 15 pupils, CAGs were not used in the standardisation. For cohorts of between 6 and 15 pupils, the grading algorithm combined CAGs with Ofqual’s historic data model. For fewer than 6 pupils, the CAG alone was used. The reason for this was that the normal standardisation model could not be used with a small group size. So because CAGs tend to be generous, small cohorts got better results.
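The cohort-size rules described above can be summarised in a few lines of code. This is a minimal illustrative sketch, not Ofqual’s actual implementation: the function name and return labels are invented for clarity, and the thresholds are those stated in this article.

```python
# Illustrative sketch of the cohort-size rules described above.
# The function name and labels are hypothetical, not Ofqual's code.

def grading_basis(cohort_size: int) -> str:
    """Return which inputs determined grades for a subject cohort of this size."""
    if cohort_size < 6:
        return "CAG only"              # teacher grades used directly
    elif cohort_size <= 15:
        return "CAG + historic blend"  # combination of CAGs and the historic model
    else:
        return "historic model only"   # CAGs not used in standardisation

print(grading_basis(4))   # CAG only
print(grading_basis(10))  # CAG + historic blend
print(grading_basis(25))  # historic model only
```

A Latin set of four pupils would therefore keep its generous CAGs, while a maths set of twenty-five would be graded entirely on the school’s past results.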
Why do independent schools have small subject cohort sizes? Partly because some are small schools, partly because they support small subjects which are less common in state schools – Latin, Greek, ancient history, music, further maths etc.
On the other hand, most independent school sixth forms are quite large and most of the subjects taken have more than 15 students. So these schools have suffered in exactly the same way as all schools from the impact of using historic data in the algorithm. A school which was improving over time received no recognition of that improvement.
Fewer students are at risk of being downgraded when the historical grades at their school are mostly higher grades and there is little variation in the grades awarded; this is a description of many independent schools at both GCSE and A-level.
Why did Ofqual and the DfE get things wrong?
Because the emphasis has been on ‘getting the results right nationally’ which is not the same thing at all as getting the results right for individual students. Students who go to selective schools with consistently good results do well out of the system. Those who go to schools whose results are weak or inconsistent do less well. So while the national distribution of grades is similar to that of last year, that does not mean that individuals got the right result.
Secondly, the algorithm had major flaws. We weren’t allowed to see the algorithm before results day. If we had, someone might have spotted these flaws.
Thirdly, they kept claiming that the exams this year ‘have the same currency’ as every other year. One understands why they made this claim – the results this year need credibility if they are going to give access to universities. But the truth is that they do not have the same currency. No exam was sat, and the main method by which grades have been awarded has been laying the teachers’ rankings of pupils in each subject on top of the historic grade distribution achieved by each school over the past two to three years. Such a system means that many students have simply not been awarded the grade they would have achieved had they sat an exam. It would have been better to tell the truth from the start, which was that this system was fine for 60-70% of pupils but possibly unfair for the rest.
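The mechanism just described – a teacher’s rank order laid on top of a school’s historic grade distribution – can be sketched as follows. This is an illustration of the principle only, under assumed inputs; Ofqual’s real model was considerably more elaborate.

```python
# Illustrative sketch (not Ofqual's actual model): pupils ranked by their
# teachers are mapped onto the school's historic grade distribution.

def assign_grades(ranked_pupils, historic_distribution):
    """ranked_pupils: pupil names ordered best-first by teacher rank order.
    historic_distribution: (grade, proportion) pairs, best grade first,
    proportions summing to 1 (e.g. the school's last 2-3 years of results)."""
    n = len(ranked_pupils)
    grades = {}
    i = 0
    cumulative = 0.0
    for grade, proportion in historic_distribution:
        cumulative += proportion
        # pupils up to this cumulative share of the cohort get this grade
        while i < n and i < round(cumulative * n):
            grades[ranked_pupils[i]] = grade
            i += 1
    while i < n:  # any rounding remainder falls into the lowest grade
        grades[ranked_pupils[i]] = historic_distribution[-1][0]
        i += 1
    return grades

pupils = ["P1", "P2", "P3", "P4", "P5"]        # teacher's rank order, best first
history = [("A", 0.2), ("B", 0.4), ("C", 0.4)]  # school's historic distribution
print(assign_grades(pupils, history))
# {'P1': 'A', 'P2': 'B', 'P3': 'B', 'P4': 'C', 'P5': 'C'}
```

The sketch makes the unfairness visible: however strong this year’s cohort actually is, only one pupil can get the A, because only 20% of past pupils did.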
Finally, the appeals system has been too narrowly drawn. In order to limit the number of appeals the first iteration of the system limited appeals to ‘wrong data inputted’. As the scale of the problems became apparent, the doors have gradually been thrown open.
What should they have done?
The exams could in fact have been held, with social distancing, in the summer. But we did not know that in March.
Ofqual could have issued anonymised grades by subject to schools in June and taken appeals from schools (not students) at that point. Many injustices could have been ironed out then.
What can schools where results for individual pupils seem wrong do about it?
- First, ask the relevant exam board to explain how anomalous results were determined.
- Appeal on the basis that the data used was wrong – the historic data did not fairly ‘fit’ your cohort this year, subject by subject, or the data ignored the value-added score of your school.
- Appeal on the basis of mock exam grades if that is permitted (see Ofqual guidance).
Will the appeals system work?
Given the political pressure it seems likely that many students will have their grades revised upwards. But if the number of appeals is huge, this might take a long time.
What is happening with university entry?
Universities over-offer on the assumption that a proportion will not meet the offer. This year many more met the offers because there has been grade inflation. So the higher tariff universities filled quickly and were less able to show clemency to those who just missed. They are less able to ‘hold’ places for students who are appealing. There is a limit to the number of students they can take dictated by the Government’s cap and by available accommodation.
This is why some students find themselves having to choose between deferring to 2021 (if they are appealing or are being offered a place for 2021 by their first-choice university) or applying to a lower tariff university. Universities like Buckingham have degrees which start in January, which could be attractive to some.
Are English students being treated unfairly compared to other parts of the UK?
Yes, the Scots have given their students the inflated CAGs as their Higher grades. In Wales and Northern Ireland they are using AS-level results to help determine A-level grades. Most students in Wales and Northern Ireland take AS-levels because, unlike in England, they contribute towards the A-level score. Few students take AS-levels in England these days.
Will many take the Autumn series of exams?
Some will, but not many. If you want to be a doctor and have failed to gain a place in medical school, it would be wise to plan to sit the A-levels. Schools should give advice about how much tuition they can offer. Clearly the focus should be on doing practice papers under exam conditions.
Nationally, most of those taking GCSEs transfer to a new school for sixth form so it is hard to see them getting much tuition, while A-level students have left school and may find it hard to access tuition as well.
JCQ has confirmed that GCE AS and A-level examinations start on Monday 5 October and finish on Friday 23 October. GCSE examinations start on Monday 2 November and finish on Monday 23 November.
- All GCSEs and A-levels will be offered, but not the Extended Project Qualification or the Advanced Extension Award.
- The autumn exam series will be in the same format as the summer 2020 exams would have been had they not been cancelled.
- There will be no non-examined assessment except in art GCSE, AS and A-level, where there will be a timed practical.
- Students will be able to carry forward their A-level practical science and geology and GCSE English language endorsements to the autumn series.
- Reviews of marking will operate as usual.
- The better of the two grades (summer assigned grade and autumn exam) will count.
By Barnaby Lenon
Dean of Education at the University of Buckingham