Hansen, E. G., Liu, L., Rogat, A., & Hakkinen, M. T. (2016). Designing innovative science assessments that are accessible for students who are blind. Journal of Blindness Innovation and Research, 6(1). https://doi.org/10.5241/6-91
The conditions of the science assessment task included "(a) screen reader and supplementary non-speech audio, (b) game-controller-based haptics, (c) tablet-based vibrotactile haptics, and (d) tactile graphics" [from Abstract]. This case study sought usability feedback from students about these ways of receiving and responding to the science task.
Three students who were blind, attending grades 8–9 at a Texas residential school, were interviewed. Participants were selected because they had little or no sight and therefore did not use vision to access testing materials, had no other barriers such as cognitive disabilities or limited English proficiency, and had average or above-average reading skills. Additional personal, academic, and experiential information was reported for these participants.
The qualitative data included students' responses to open-ended and rating-scale questions during a background interview, a researcher observation form, and a post-session interview. The researchers checked whether students could accurately access basic information in the simulation assessment task by asking them to report the number of particles displayed (the actual number was 10). They also documented observations of students engaging with the science-content simulations through various accessibility tools that conveyed, in non-visual form, the information students with limited or no sight needed to complete the task. In a post-task interview, participants provided reflective feedback about their experiences completing the assessment task under the various conditions.
Of the four conditions—(1) screen reader with sound-only static and dynamic simulation, (2) Falcon controller knob haptic static and dynamic simulation, (3) Android touch-screen static-only simulation, and (4) tactile-graphic paper-based static simulation—all three students most accurately recognized the information in the fourth (paper-based static tactile graphic) condition. Both device-based haptic simulations (#2 and #3) were largely unsuccessful in communicating the basic information students needed to understand the task; the one exception was a student who correctly identified the number of particles under the Falcon haptic condition during the third stage of the simulation. The control condition (#1) had both static and dynamic parts. The screen reader (JAWS) was usable, yet students answered some rating-scale questions with "agree" or "neither agree nor disagree"—possibly indicating limited enthusiasm. They were able to navigate at least some of the information, but the table format was not navigable for two students. In the dynamic simulation, where sound-only information about particle collisions was presented, all students expressed confusion about what the sounds were intended to convey. When asked about the three quasi-experimental conditions, two of the three students strongly recommended the Falcon tool; observations, however, indicated that all students had difficulty locating the particles in the three-dimensional space. Students had similar difficulty locating the particles with the Android haptics in the two-dimensional space, owing to the particles' relative size, and possibly in distinguishing the vibrations from other information when they did locate them. Because information about the particles was essential to demonstrating understanding of the concepts being tested, these difficulties interfered with students' task performance.
Students indicated on the rating-scale interview questions that the tactile graphics were both familiar and easy to use. The two dynamic simulations seemed to engage and motivate students: two found them interesting, and at least one reported learning a central concept during the simulation. The researchers concluded that usability issues limited the utility of the haptic tools and offered recommendations for addressing these challenges, such as the use of multi-touch (i.e., more than one finger) haptics. They stated that "visually disabled individuals needed greater levels of support in order to make this simulation-based assessment task accessible and usable" (p. 27 of 34). These findings have implications for incorporating such assistive-technology features into science assessment items when designing tests for students who are blind. The study's limitations were reported.