PISA (the Programme for International Student Assessment) is run by the OECD (Organisation for Economic Co-operation and Development). The programme started in 2000, with tests taking place every three years. PISA is the most rigorous project ever undertaken to assess what makes schooling effective.
PISA tests are computer-based, administered to a sample of 15-year-olds in each country and cover reading, science and mathematics; 15-year-olds are chosen because at this age most children in most OECD countries are reaching the end of compulsory education. The tests are not directly linked to the school curriculum. Additional questions are asked to discover something about the schools the pupils go to, their socio-economic background and their attitude to school.
In each test subject, there is theoretically no minimum or maximum score in PISA; rather, the results are scaled to fit approximately normal distributions, with means for OECD countries around 500 score points and standard deviations around 100 score points. About two-thirds of students across OECD countries score between 400 and 600 points. Less than 2% of students, on average across OECD countries, reach scores above 700 points, and at most a handful of students in the PISA sample for any country reach scores above 800 points. The OECD equate 30 PISA test points to a year of additional schooling.
PISA in England is administered by NFER, the National Foundation for Educational Research. Schools which are not chosen in the sample can pay to do the PISA test as ‘extra’ schools. All schools are given feedback about their results.
Worldwide, some 600,000 students completed the assessment in 2018, representing about 32 million 15-year-olds in the schools of the 79 participating countries and economies. In the United Kingdom, 13,818 students in 538 schools completed the assessment, representing 597,240 15-year-old students (85% of the total population of 15-year-olds).
Small differences in scores between countries have no statistical significance. Trends are only significant if they persist over three PISA cycles, because each PISA round brings changes in methodology (such as the switch from paper to computer-based testing), and these changes themselves shift scores.
1 Andreas Schleicher, the OECD’s education director, said there were “positive signals” from the UK’s results for the tests taken in 2018 – which he said showed “modest improvements”.
UK world ranking   2015   2018
Maths              27th   18th
Science            15th   14th
Reading            22nd   14th
Places with statistically significantly higher scores than England (only), 2018:
Maths:   Four Chinese cities, Canada, Singapore, Macao (China), Hong Kong (China), S. Korea, Estonia, Japan, Taiwan, Netherlands, Switzerland, Poland
Science: Four Chinese cities, Canada, Singapore, Finland, Macao (China), Hong Kong (China), S. Korea, Estonia, Japan, Taiwan
Reading: Four Chinese cities, Canada, Singapore, Finland, Macao (China), Republic of Ireland, Hong Kong (China), S. Korea, Estonia
2 England is ahead of the rest of the UK on all measures. The 2010-2015 coalition reforms, which only happened in England, may have helped – although it takes at least ten years to be at all certain. The national numeracy strategy, introduced in primary schools after 1999, may have helped too.
Wales is the weakest. Former education minister Leighton Andrews said the Welsh Government may have taken its “eye off the ball” in the 2000s when it scrapped testing and league tables.
The PISA report into the 2015 tests found only 6% of the variation in student performance in Wales could be put down to their socio-economic background.
Mathematics mean score   2015   2018
England                   493    504
Wales                     480    487
Scotland                  490    489
N Ireland                 493    492
In mathematics, both England and Wales show an improving trend across successive PISA cycles, while Scotland has declined and Northern Ireland has remained broadly stable.
England’s mean score in mathematics was significantly higher than in PISA 2015, which is the first time performance has improved after a stable picture in all previous cycles of PISA. England’s average score was also significantly higher than the OECD average.
Science mean score   2015   2018
England               512    507
Wales                 485    488
Scotland              497    490
N Ireland             500    491
In science there has been a decline in performance over successive cycles of PISA in Scotland, Wales and Northern Ireland, all of which had mean scores that were significantly lower than those in PISA 2006.
In England, the gap between high and low achievers in science was significantly larger than the OECD average, with a larger proportion of pupils in England performing at the highest proficiency levels.
Reading mean score   2015   2018
England               500    505
Wales                 471    483   (well below the rest)
Scotland              493    504
N Ireland             497    501
All countries of the UK show a stable trend in reading, apart from a significant improvement in Scotland since PISA 2015, which followed a similarly sized decrease between 2012 and 2015.
In common with all other participating countries, girls in England outperformed boys in reading. However, the gender gap in England was significantly smaller than the average gap across the OECD.
3 The UK is good for equity: disadvantaged pupils and the children of immigrants do better than in many countries. For example, 14% of disadvantaged students scored in the top quarter of performance in reading, indicating that disadvantage is not destiny. Socio-economic status only explained 12% of the variation in mathematics performance in PISA 2018 in the United Kingdom (compared to 14% on average across OECD countries), and 11% of the variation in science performance (compared to the OECD average of 13% of the variation).
Although private schools tend to be more selective, which contributes to social segregation in the school system, most of the social segregation across schools comes from within the public sector rather than from social segregation between public and private schools.
4 The proportion of students with an immigrant background in the United Kingdom increased between 2009 and 2018, from 11% to 20%. One in three of these immigrant students was socio-economically disadvantaged. Students without an immigrant background outperformed their immigrant peers by four points in reading, after accounting for students’ and schools’ socio-economic profile. However, in spite of their relative socio-economic disadvantage, 21% of immigrant students scored in the top quarter of reading performance – 4 percentage points larger than the OECD average (17%).
5 When you look at variations in pupil performance within and between schools, the UK has a low level of variation between schools (i.e. most schools do equally well) but a high level of variation within schools (a wide range of outcomes, with some pupils doing much better than others).
6 It’s not about spending on education. Australia, the United Kingdom and the United States all spend more than USD 107,000 per student from age 6 to 15, yet scored no better than (and in some cases, below) Canada, Ireland and New Zealand, all of which spend between 10% and 30% less.
7 Behaviour is a problem. Some 25% of students in the United Kingdom (OECD average: 26%) reported that, in every or most lessons, their teacher has to wait a long time for students to quieten down. These students scored 30 points lower in reading than students who reported that this never happens or happens only in some lessons, after accounting for socio-economic status.
8 Mental health is a problem. The UK had some of the lowest scores of any country for “life satisfaction” and for feeling they had “meaning” in their lives. In England, 66% of young people said they were sometimes or always worried – compared with an OECD average of 50%.
In the United Kingdom, 27% of students reported being bullied at least a few times a month, compared to 23% on average across OECD countries.
Problems with PISA
It is naïve to assume that Asian countries do well simply because of something that happens in their classrooms. It could well be their wider culture which is responsible for the academic success of their pupils. So requiring British schools to adopt the classroom methods of Asia is what social scientists call a ‘category error’. Much of the success of Singapore could be due to the prevalence of out-of-school tutoring.
In some countries the PISA tests have higher status than in others, and these countries have adapted their curriculum to try to boost their PISA scores. Success in the PISA rankings does not mean they have ‘the best’ schools.
Many of the jurisdictions which do best are small and socially homogeneous: Hong Kong, Singapore, Finland, Estonia (1.3 million), Taiwan. It is wrong to compare these with large and much more ethnically diverse countries. The stellar results for China are in fact based on just four cities. If we only took the scores for London as representing the whole UK we would have a very different picture.
PISA publishes huge numbers of reports based on correlations between PISA scores and other variables. For example, PISA claimed that high-scoring jurisdictions gave schools greater autonomy (and this influenced the current UK government). But we all know that correlations do not prove cause and effect – Janet Ouston (1999) showed that the number of pot plants in a school correlated well with exam results. Some of PISA’s conclusions may be misleading.
And you have to read the fine print to know that PISA rankings have a big margin of error. A country might in truth be ‘between 19th and 27th’ but will always be reported in the newspapers as being 23rd.
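How sampling error in national mean scores turns a single published rank into a range like ‘between 19th and 27th’ can be illustrated with a small Monte Carlo sketch. The country names, mean scores and standard errors below are hypothetical, not real PISA figures:

```python
import random

random.seed(42)

# Hypothetical (mean score, standard error) pairs -- NOT real PISA data
countries = {
    "Country A": (506, 3.0),
    "Country B": (503, 3.0),
    "Country C": (501, 3.0),  # the country whose rank we track
    "Country D": (499, 3.0),
    "Country E": (496, 3.0),
}

ranks = []
for _ in range(5000):
    # Draw a plausible "true" mean for each country given its standard error
    draw = {name: random.gauss(mean, se) for name, (mean, se) in countries.items()}
    ordered = sorted(draw, key=draw.get, reverse=True)
    ranks.append(ordered.index("Country C") + 1)

print(f"Published rank: 3; plausible range: {min(ranks)} to {max(ranks)}")
```

With means only a few points apart and standard errors of similar size, the simulated rank of “Country C” varies well beyond its published position of 3rd, which is exactly why single-number rankings in newspapers overstate precision.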
Professor Barnaby Lenon, Dean of Education