Journal Article

Smolinsky, L., Marx, B. D., Olafsson, G., & Ma, Y. A. (2020). Computer-based and paper-and-pencil tests: A study in calculus for STEM majors. Journal of Educational Computing Research, 58(7), 1256–1278. https://doi.org/10.1177/0735633120930235

Tags

Electronic administration; Math; No disability; Postsecondary; U.S. context

URL

https://journals.sagepub.com/home/jec

Summary

Accommodation

Accommodations were not investigated per se; rather, testing conditions were compared between exams administered in paper-and-pencil and computer-based modes. An important difference between these test modes was how they were graded. Traditional paper-and-pencil mathematics exams have called for item responses that show the logical argument of the solving process, including the written calculations leading to the answer; "showing one's work" demonstrates examinees' understanding and has typically earned partial credit even when the final result was inaccurate. In contrast, computer-based mathematics exams captured only final item responses, so partial credit was not available. Computer-based exams also provided immediate feedback to test-takers.

Participants

A total of 324 postsecondary engineering students from Louisiana State University participated. Demographic characteristics were reported, including race/ethnicity, gender, and Pell grant status (as a proxy for socioeconomic status); other information, such as ACT mathematics scores, was also collected. These additional data were used in analyses, including as covariates. No disability information was reported, nor was disability used as a systematic factor for comparison.

Dependent Variable

Four testing periods in each academic term of the science, technology, engineering, and mathematics (STEM) Calculus II course meant that four performance scores were collected for each participant. The testing design involved three data collection options: Section 1 students completed exams only in paper-and-pencil mode, Section 2 students completed two computer-based and two paper-and-pencil exams, and Section 3 students completed exams only in computer-based mode. The course was offered both as an on-campus (in-person attendance) course and as an online-only (remote attendance) course. Academic content and process were identical: students received the same lectures, assignments, sample tests, sample problems, and due dates regardless of exam format, which was randomly assigned by student. Analyses were intended to determine whether the testing modes measured knowledge and skills in a consistent manner. Further, researchers sought to determine how the knowledge and skills students gained in a class with computer-based tests compared to those gained in a class requiring paper-and-pencil tests.

Findings

Performance on computer-based exams was consistent with performance on paper-and-pencil exams in the postsecondary Calculus II course. However, researchers concluded that classes using paper-and-pencil tests yielded better outcomes than classes using only computer-based tests.