Parshall, C. G., & Kromrey, J. D. (1993, April). Computer testing versus paper-and-pencil testing: An analysis of examinee characteristics associated with mode effect [Paper presentation]. Annual meeting of the American Educational Research Association (AERA), Atlanta, GA, United States. https://eric.ed.gov/?id=ED363272

Presentation

Tags

College entrance test; Electronic administration; No disability; Postsecondary; U.S. context

URL

https://eric.ed.gov/?id=ED363272

Summary

Accommodation

Students were given a computer administration of the test. The computer-administered test was preceded by a tutorial on the use of the test administration software.

Participants

Participants were students who opted to take a second administration of the Graduate Record Examination (GRE) General Test. They were offered a $50 honorarium for participating and were given the option, if the pilot study proved successful, of having their scores from the computer administration added to their score records. A total of 1,114 students participated in the computer administration.

Dependent Variable

Examinees' residualized difference scores on the Verbal, Quantitative, and Analytical scales of the GRE were used as dependent variables. These scores were computed from examinees' original paper-and-pencil test scores and their computer-administered test scores. A mode effect was considered present when a student's residualized difference score fell at least one standard deviation from the mean (above the mean = computer mode effect, below the mean = paper-and-pencil mode effect). Characteristics of examinees demonstrating mode effects were then examined. Students' test mode preference was also measured.
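The residualized-difference approach described above can be sketched as follows. This is a minimal illustration with simulated data (not the study's data, scales, or sample sizes): regress computer scores on paper-and-pencil scores, take the residuals, and flag examinees whose residual falls at least one standard deviation from the residual mean.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated scores for illustration only (not the study's data):
# a paper-and-pencil score and a computer-administered score per examinee.
paper = rng.normal(500, 100, size=200)
computer = paper + rng.normal(20, 40, size=200)

# Residualized difference scores: regress computer scores on paper scores,
# then keep the residuals, so each examinee's value reflects how much better
# or worse they did on computer than predicted from their paper score.
slope, intercept = np.polyfit(paper, computer, 1)
residuals = computer - (intercept + slope * paper)

# Flag mode effects at one standard deviation from the residual mean:
# above the mean suggests a computer mode effect, below the mean a
# paper-and-pencil mode effect.
z = (residuals - residuals.mean()) / residuals.std()
computer_mode_effect = z > 1
paper_mode_effect = z < -1
```

Because the residuals come from an ordinary least-squares fit, they are centered near zero by construction, so the one-standard-deviation cutoffs isolate examinees in the tails of the residual distribution.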

Findings

Across all three test scales, examinees performed significantly better on the computer version of the examination than on the paper version of the same test. However, test mode and test order were perfectly confounded in the original design of the computer test pilot study; the authors therefore suggested that practice effects may account for this difference. Data analysis indicated that a small subset of individuals demonstrated a mode effect. However, none of the examinee characteristics investigated (demographic variables, computer use variables, and test-taking strategy variables) were consistently related to mode effects.