Kong, X., Davis, L. L., McBride, Y., & Morrison, K. (2018). Response time differences between computers and tablets. Applied Measurement in Education, 31(1), 17–29. https://doi.org/10.1080/08957347.2017.1391261
This study compared completing online test items on desktop or laptop computers with keyboards against completing them on electronic tablets with touchscreens. The central focus was item response times, not response correctness.
Participants were 964 high school students from five school districts in Virginia. Response times were gathered from 479 students in the computer-based assessment condition and 485 students in the tablet-based assessment condition. Additional demographic data, including gender (451 female) and ethnicity (615 White), were gathered to examine possible differences across these student groups. No disability data were reported, and it is presumed that data from students with disabilities were not examined.
Item response times were measured for 59 high-school-level test items covering academic content in reading (English II), science (Biology), and mathematics (Algebra I); performance scores were not the focus of the comparisons. The source of the assessment items was not explicitly stated, though they appear similar to state assessment items. An additional factor was built into the test task: items were presented in several selected-response and constructed-response formats. These included traditional multiple-choice items and technology-enhanced item types: multiple select, drag and drop, fill in the blank, graph point, hot spot, and inline choice (drop-down box). Participants' state reading assessment scores were compared across the tablet and computer conditions to confirm that the two groups were comparable. Each participant also completed a 10-question researcher-developed survey on familiarity with the devices and perceptions of test-taking during the study. [Note: Participants' response time effort (RTE) was estimated, which incorporated consideration of possible rapid-guessing behavior as opposed to solution behavior (working toward a correct response). Responses showing demonstrable rapid guessing were excluded from the time analysis. Correctness scores were calculated separately for solution behavior and rapid-guessing behavior, not as primary comparison data but to check the validity of classifying these test-taking behaviors.]
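The RTE estimation and rapid-guess exclusion described above can be sketched with threshold-based flagging of short response times. This is a minimal illustration, not the study's actual procedure: the 3-second threshold and the sample times below are assumptions for demonstration only (operational work typically sets thresholds per item).

```python
# Hedged sketch of response time effort (RTE) and rapid-guess filtering.
# ASSUMPTIONS: a single 3-second threshold and made-up response times;
# the study's actual thresholds and data are not reported here.

RAPID_GUESS_THRESHOLD = 3.0  # seconds; per-item thresholds vary in practice


def response_time_effort(times, threshold=RAPID_GUESS_THRESHOLD):
    """RTE = proportion of items answered with solution behavior,
    i.e., response time at or above the rapid-guess threshold."""
    solution_count = sum(1 for t in times if t >= threshold)
    return solution_count / len(times)


def filter_rapid_guesses(times, threshold=RAPID_GUESS_THRESHOLD):
    """Drop response times that fit a rapid-guessing pattern before
    computing mean item times, as the study did for its time analysis."""
    return [t for t in times if t >= threshold]


# Hypothetical per-item response times (seconds) for one examinee.
item_times = [45.2, 1.1, 60.8, 2.4, 38.9, 52.0]
rte = response_time_effort(item_times)        # 4 of 6 items kept
kept_times = filter_rapid_guesses(item_times)  # rapid guesses removed
```

Filtering before averaging matters because rapid guesses (here 1.1 s and 2.4 s) would otherwise pull mean item times downward and distort device comparisons.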
Relatively small but statistically significant response time differences were found between devices: participants took longer to complete test items on tablets with touchscreen responding than on computers with keyboard responding. The difference was largest in science, where the tablet-based section took about 100 seconds longer to complete than the computer-based items; the differences were 58 seconds in reading and 52 seconds in math. Analyses by item type showed similar patterns, with one exception among the technology-enhanced types: tablet-based items took statistically significantly longer to complete on average, except for drag-and-drop items, which took no longer on touchscreens than with a keyboard and mouse. The researchers noted they had expected fingertip movement of information blocks or other objects to take less time than using a mouse, but no difference was found. The item-type response time differences between tablet and computer were less than 10 seconds per item, with small effect sizes. These differences were similar in direction across gender and ethnicity subgroups; that is, each subgroup's means showed the same trend of taking longer on tablet-based items than on computer-based items in each content area. All of these patterns held after removing response time data for items that fit rapid-guessing patterns. Incidentally, rapid-guessing behavior was estimated at 0% to 14% for the tablet format and 0% to 19% for the computer format. Few student survey findings were reported. Limitations of the study were reported, and future research directions were suggested.