Davis, L. L., Morrison, K., Schnieders, J. Z.-Y., & Marsh, B. (2021). Developing authentic digital math assessments. Journal of Applied Testing Technology, 22(1), 1–11. https://jattjournal.net/

Journal Article
Electronic administration; High school; Math; No disability; Student survey; Technological aid; U.S. context





"Technology-Enhanced Assessments and Items" (TEI) and traditional input (response) methods were compared; these methods were not identified as accommodations per se, but as universally available general assessment response formats. Test item response conditions included digital stylus on screen, keyboard typing, and traditional paper-and-pencil written responses.


Participants

One hundred high school students enrolled in Algebra II or Pre-calculus classes at a school recognized as a leader in classroom use of digital ink and tablet applications participated. All students had experience using digital styluses with digital ink applications. Student demographic data, including gender and ethnicity, were collected; no disabilities were noted.

Dependent Variable

A mathematics assessment consisting of 18 constructed-response items was created by ACT content developers specifically for this study. Math content included algebra, geometry, and numbers/quantities. Half of the items called for math process skills, such as traditional computation; the other half required math practice skills, in which students provided answers and explained their rationales, such as demonstrating the application of formulas to reach solutions. A repeated measures design was implemented, with all students participating in "three input mechanism conditions": digital stylus, keyboard, and paper-and-pencil. After completing the math assessment, each participant responded to a 13-question survey about their experiences.


Findings

Mean performance scores showed no statistically significant differences across item response input conditions. When scores were compared by response condition for math content areas and for math construct type, there were slight group mean differences, but none reached statistical significance. Student surveys showed clear preferences among the response conditions: paper-and-pencil was the most highly preferred, and digital stylus was preferred to keyboard. Limitations of the study were reported, and directions for future research were suggested.