Davis, L. L., Orr, A., Kong, X., & Lin, C.-H. (2015). Assessing student writing on tablets. Educational Assessment, 20(3), 180–198. https://doi.org/10.1080/10627197.2015.1061426
The researchers initially developed five equipment conditions, including tablets with and without styluses; they later combined these conditions into three testing formats for analysis: laptop computer, tablet with external keyboard, and tablet with onscreen keyboard. The article also details the screen sizes and the makes and models of the laptop and tablet devices.
Valid writing test responses were submitted by 387 grade 5 students and 439 high school students (in grades 10 and 11). Participants attended eight schools in five school districts in the Sioux Falls, South Dakota, area and three schools in Isle of Wight, Virginia. Demographic data such as gender and race/ethnicity were also reported. The study did not specify whether students with disabilities were included or excluded from participation. Students' state assessment reading performance data or previous writing assessment scores served as covariates.
The writing assessment task comprised a set of grade-level Pearson WriteToLearn essay prompts for grade 5 and for high school grades 10 and 11, selected by a content specialist in English language arts. The prompts were designed to elicit responses of 350 to 450 words. Each student composed a response to one essay prompt in one randomly assigned testing condition, then copied and pasted the essay into the text box on the screen. The essays were analyzed using software that documented seven features: total character count, average content-word character count, percentage of content words (excluding "function words"), percentage of misspelled words, total sentence count, total word count, and average words per sentence. Professional scorers, who were blind to the essays' assigned conditions, evaluated the essays on a six-point rubric. Participants were also asked to complete a 15-item survey about their device use experience and their experience with the assessment task.
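The article does not describe the analysis software in implementable detail, so the following is only a minimal illustrative sketch of how the seven surface features listed above could be computed. The function-word and dictionary lists here are hypothetical stand-ins (the study's actual word lists are not published in the summary), and the sentence splitter is deliberately naive.

```python
import re

# Hypothetical word lists for illustration only; the study's actual
# function-word list and spelling dictionary are not specified.
FUNCTION_WORDS = {"the", "a", "an", "and", "or", "but", "of", "to", "in", "is", "was", "it"}
KNOWN_WORDS = FUNCTION_WORDS | {"student", "wrote", "essay", "short", "teacher", "read"}

def essay_features(text):
    """Compute seven surface features analogous to those named in the study."""
    words = re.findall(r"[A-Za-z']+", text.lower())
    # Naive sentence segmentation: split on terminal punctuation.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    content = [w for w in words if w not in FUNCTION_WORDS]
    misspelled = [w for w in words if w not in KNOWN_WORDS]
    n = len(words)
    return {
        "total_characters": sum(len(w) for w in words),
        "avg_content_word_chars": sum(len(w) for w in content) / len(content) if content else 0.0,
        "pct_content_words": 100.0 * len(content) / n if n else 0.0,
        "pct_misspelled": 100.0 * len(misspelled) / n if n else 0.0,
        "total_sentences": len(sentences),
        "total_words": n,
        "avg_words_per_sentence": n / len(sentences) if sentences else 0.0,
    }

feats = essay_features("The student wrote a short essay. The teacher read it.")
```

A production system would use a real tokenizer, an established stopword list, and a full spelling dictionary; the structure of the computation, counting and averaging over tokenized words and sentences, would remain the same.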
An early conclusion, based on observation, was that students generally did not use the styluses, either overall or specifically for revising essays; in fact, students tended not to revise their essays even though they were given the opportunity and instructions to do so. Task performance on the essay features did not differ significantly across testing formats within either schooling level, and neither did mean evaluation scores. In other words, neither grade 5 nor high school students varied in performance by testing format, whether they completed the writing task on a laptop, a tablet with an external keyboard, or a tablet with a touchscreen keyboard. According to the survey results, students' use of these technologies varied across settings: touchscreen devices were used more at home, and computers without touchscreens more at school. When writing essays at school, students most commonly composed on paper and on computers (without touchscreens) interchangeably; few used only one format, and even fewer used touchscreens. When asked about ease of use, few students reported difficulty with tablets with touchscreen keyboards, although more high school students than grade 5 students rated touchscreens "somewhat difficult." High school students tended to prefer physical keyboards over touchscreens for writing compositions. The study's limitations were reported.