Pen and ink are better than computers in many ways

Last week Ofqual published research into the use of computers for learning and assessment. This is what they concluded:

On-screen exams should not be permitted in most GCSE and A level subjects for the time being. Some research suggests handwriting can benefit learners in a way that is not replicated in typing because it may contribute differently to cognitive processes such as memory retention. Writing by hand should therefore remain a core part of students’ learning.

The research also shows that schools' and colleges' readiness to support whole cohorts of students sitting high-entry on-screen exams at the same time varies significantly. We know that many do not have enough suitable digital devices, or the highly reliable IT infrastructure, facilities and support that would be needed for something as high stakes as large-scale public exams on-screen.

Parents and teachers warn of the potential negative impacts, or long-term detriment, to mental wellbeing and social skills from increased time spent on-screen, whether in learning or assessment preparation. Concerns include reduced time for in-person interaction and a potential loss of interpersonal skills, especially for children with SEND.

Key research insights included evidence that students often comprehend information more effectively on paper than on screen, particularly for longer, expository texts and under time pressure.

Maths exams are a particular problem. Johnson and Green (2006) conducted a study where 104 11-year-olds answered matched sets of mathematics questions, one set on paper and the other on screen. Students were provided with scratch paper when answering on screen. The authors found that although there were relatively few transcription errors overall, they were more likely to occur when children answered on computer. They theorised that the higher number of transcription errors in the on-screen condition may be related to the “physical distance that the information needs to be carried during the processing of the problem”. More specifically, they noted that “answering on screen might require that the question is read on screen, details held in memory as attention shifts to paper to allow working to be transcribed on paper, then these details are held in memory while attention shifts back to the screen and then the answer is typed into the answer space” (page 25). In contrast, paper-based responses allow these actions to occur in close physical proximity, reducing working memory load and the likelihood of error.

Calculation and annotation facilitated (CAF) tasks are mathematical tasks that share one or both of the following characteristics:

  • they require multiple steps to reach a correct solution, and the ability to record intermediate steps (or elements of them) in close proximity to the question reduces cognitive load and supports accuracy
  • they include diagrams or visual elements, for which annotation or tactile strategies (such as counting or ‘marking off’ steps) aid performance

Evidence indicates that these tasks are easier when taken on paper than on screen. In the absence of robust, contextually relevant test data to the contrary, mode effects should be anticipated in tasks with moderate or high intrinsic load.

Threlfall et al. (2007) examined construct validity in a series of mathematics questions that were trialled on both paper and computer. Children performed considerably better on paper, and central to this was the ease of recording working beside the question. While on-screen test takers were provided with scratch paper, very few children used it.

Another strand of research has investigated marker bias. Typically, identical essays are presented to trained markers in both typed and handwritten form. These studies tend to show that typed responses are marked more harshly than handwritten responses (for example Russell and Tao, 2004a). This trend is supported by a meta-analysis of five studies (Graham et al., 2011), whose authors also noted that, while marker-effect studies had produced mixed results, the majority showed an advantage for handwritten texts.

Russell and Tao (2004a) note several potential reasons for this bias, including that errors are more visible in typed responses; markers perceive handwritten responses as longer; handwritten essays appear more personal, creating a stronger sense of connection between the marker and the writer; and markers tend to hold higher expectations of quality for typed responses.

The conclusion is that in England we are keeping handwritten exams.

By Professor Barnaby Lenon, Dean of the Faculty of Education at The University of Buckingham

Bibliography

Ofqual research reports online:

  • On-screen assessment research study, 11 December 2025

Other research:

  • Graham, S., Harris, K., & Hebert, M. (2011). Informing writing: The benefits of formative assessment. Washington, DC: Alliance for Excellent Education
  • Johnson, M., & Green, S. (2006). On-Line mathematics assessment: The impact of mode on performance and question answering strategies. Journal of Technology, Learning, and Assessment, 4(5), 1–34
  • Russell, M., & Tao, W. (2004a). Effects of handwriting and computer-print on composition scores: A follow-up to Powers, Fowles, Farnum, & Ramsey. Practical Assessment, Research, and Evaluation, 9(1)
  • Threlfall, J., Pool, P., Homer, M. & Swinnerton, B. (2007). Implicit Aspects of Paper and Pencil Mathematics Assessment that Come to Light through the Use of the Computer. Educational Studies in Mathematics, 66(3), 335–348
