Lynch, S. (2022). Adapting paper-based tests for computer administration: Lessons learned from 30 years of mode effects studies in education. Practical Assessment, Research, and Evaluation, 27(1), Article 22. https://scholarworks.umass.edu/pare/
[No DOI reported.] Also available at the journal webpage (https://scholarworks.umass.edu/pare/vol27/iss1/22/) and at ERIC (https://eric.ed.gov/?id=EJ1359345).
Accommodations were not specified. The author examined the increasing use of computer-based tests (CBTs) in place of paper-based tests (PBTs) and the reliability and validity of score interpretations when the test mode is changed.
Literature was reviewed to investigate trends in comparability studies of PBTs and CBTs in education over the past 30 years. Computer-adaptive tests were not discussed; the review focused on peer-reviewed studies that used experimental, quasi-experimental, and mixed-methods designs.
Several factors were identified that affect how examinees interact with an item on a PBT versus a CBT and that have the potential to produce different scores. The review groups these mode effects into five categories: test navigation and layout, item characteristics, cognitive processes, scoring, and student characteristics.
The author concluded that it is misguided to assume examinee performance is comparable across modes. Research is therefore needed to support the validity of score interpretations when tests are offered in both CBT and PBT modes, or when CBTs adapted from PBTs yield scores that are compared with PBT scores.