Journal Article
Prisacari, A. A., & Danielson, J. (2017). Rethinking testing mode: Should I offer my next chemistry test on paper or computer? Computers & Education, 106, 1–12. https://doi.org/10.1016/j.compedu.2016.11.008

Tags

Electronic administration; No disability; Postsecondary; Science; U.S. context

Summary

Accommodation

The study compared performance on computer-based versus paper-based examinations in a postsecondary chemistry course. [This study had the same participants as its companion investigation—Prisacari & Danielson (2017) Computer-based versus paper-based testing—in which cognitive load and scratch paper use factors were analyzed.]

Participants

Postsecondary students, all undergraduates (n=221), taking a general chemistry course at a university in the midwestern United States participated. Demographic information such as age, gender (57% female, 43% male), and race/ethnicity was reported. No disability information was reported, and no comparisons were made between groups of students by disability category.

Dependent Variable

Academic test performance in a general chemistry course was measured with three tests: two quizzes and one practice exam. The forms of each test were made equivalent in content and difficulty through a systematic item-pairing process. Six sets of items spanning three item types (algorithmic, conceptual, and definitional) were implemented across the tests and compared by type. Each test consisted of a mix of selected-response (multiple-choice) and short-answer (open-ended) questions. Students were randomly assigned to condition 1 (test 1 on computer and test 2 on paper) or condition 2 (test 1 on paper and test 2 on computer). Students then registered for one of two time slots to take the practice exam, without knowing whether it would be administered on computer or paper.

Findings

Taking into account knowledge gains between the quiz and practice exam phases, no significant differences in mean performance scores or response patterns were found between test modes, paper-based and computer-based administration. Comparisons by item type (algorithmic, conceptual, and definitional) also found no significant performance differences between the test modes. The researchers concluded that implementing computer-based testing for postsecondary students did not introduce performance differences; students anecdotally indicated that a benefit of computer-based tests was more immediate feedback on grades due to automated scoring.