Journal Article

Randall, J., Cheong, Y. F., & Engelhard, G. J. (2011). Using explanatory item response theory modeling to investigate context effects of differential item functioning for students with disabilities. Educational and Psychological Measurement, 71(1), 129–147. https://doi.org/10.1177/0013164410391577

Tags

Calculation device or software (interactive); Disabilities Not Specified; Math; Middle school; No disability; U.S. context

URL

http://dx.doi.org/10.1177/0013164410391577

Summary

Accommodation

The modifications examined were a basic function calculator and a special resource guide containing key definitions, examples, and graphics. Before completing the test, participants received a practice session with example items using these modifications.

Participants

A total of 868 grade 7 students from 74 schools throughout Georgia (U.S.) participated. The sample was shown to be representative of the Georgia student population by sex and ethnicity, but by design it oversampled students with disabilities: 378 students had disabilities and 489 did not.

Dependent Variable

Participants completed 10 items drawn from the Georgia statewide mathematics assessment that tested problem-solving skills.

Findings

The purpose of the study was to evaluate the construct validity of the items when the modifications were used under separate conditions. Model fit was examined using both descriptive item response theory models (many-facet Rasch models, MFRMs) and explanatory item response theory models (hierarchical generalized linear models, HGLMs). Students with disabilities scored lower as a group than students without disabilities across all conditions. Students using either modification scored higher than those taking the standard test, with no significant differences between groups across items overall. However, item-level analyses under both approaches indicated that item 1 was differentially more difficult for students with disabilities than for students without disabilities, yet differentially easier for students with disabilities under the calculator condition. In addition, item 10 was differentially easier for students with disabilities than for students without disabilities under each of the two modifications. These findings were consistent across the two analysis approaches. The authors reported the study's limitations and suggested directions for future research.