Differential Item Functioning Comparisons on a Performance-based Alternate Assessment for Students with Severe Cognitive Impairments, Autism and Orthopedic Impairments

Authors

  • C. C. Laitusis, Educational Testing Service
  • B. Maneckshana, Educational Testing Service
  • L. Monfils, Educational Testing Service
  • L. Ahlgrim-Delzell, University of North Carolina at Charlotte

Abstract

The purpose of this study was to examine Differential Item Functioning (DIF) by disability group on an on-demand performance assessment for students with severe cognitive impairments. Researchers examined the presence of DIF for two comparisons. The first compared students with severe cognitive impairments (reference group) to students with both autism and severe cognitive impairments (focal group). The second compared students with severe cognitive impairments (reference group) to students with both severe cognitive impairments and orthopedic impairments (focal group). Results indicated a moderate amount of DIF for the autism comparison and a negligible amount of DIF for the orthopedic impairment comparison. In addition, researchers coded all test items on characteristics likely to favor one of the three groups. Although several of the hypothesized coding categories accurately predicted DIF, the study was limited to items from one testing program for students in one state. More research is needed to determine whether these hypotheses can be replicated across testing programs and populations.
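
For illustration only: the abstract does not specify which DIF detection procedure was used, so the sketch below shows one common approach, a Mantel-Haenszel analysis of a single dichotomously scored item with examinees matched on total test score. The group labels mirror the abstract's reference group (severe cognitive impairments) and focal group (autism or orthopedic impairments in addition to severe cognitive impairments); the function name, data layout, and choice of the Mantel-Haenszel statistic are assumptions for this sketch, not the authors' method.

    import math
    from collections import defaultdict

    def mantel_haenszel_dif(item_scores, groups, matching_scores):
        """Mantel-Haenszel DIF for one 0/1-scored item (illustrative only).

        item_scores     -- list of 0/1 scores on the studied item
        groups          -- parallel list of 'ref' / 'focal' labels
        matching_scores -- parallel list of total-test scores used as strata
        Returns (common odds ratio, MH D-DIF on the ETS delta scale).
        """
        # Build a 2x2 table (group x right/wrong) within each matched stratum.
        strata = defaultdict(lambda: {"A": 0, "B": 0, "C": 0, "D": 0})
        for score, grp, stratum in zip(item_scores, groups, matching_scores):
            if grp == "ref":
                cell = "A" if score == 1 else "B"   # reference right / wrong
            else:
                cell = "C" if score == 1 else "D"   # focal right / wrong
            strata[stratum][cell] += 1

        # Pool the stratum-level odds across matched score groups.
        num = den = 0.0
        for t in strata.values():
            n = t["A"] + t["B"] + t["C"] + t["D"]
            if n > 0:
                num += t["A"] * t["D"] / n
                den += t["B"] * t["C"] / n
        if num == 0.0 or den == 0.0:
            return float("nan"), float("nan")
        alpha_mh = num / den                       # MH common odds ratio
        mh_d_dif = -2.35 * math.log(alpha_mh)      # ETS delta metric
        return alpha_mh, mh_d_dif

For context, the widely used ETS classification treats |MH D-DIF| values below 1.0 as negligible (category A) and values above 1.5 that are also significantly greater than 1.0 as large (category C), with intermediate values labeled moderate (category B); whether this particular scheme was applied in the study is not stated in the abstract.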

Published

2014-04-09

How to Cite

Laitusis, C. C., Maneckshana, B., Monfils, L., & Ahlgrim-Delzell, L. (2014). Differential Item Functioning Comparisons on a Performance-based Alternate Assessment for Students with Severe Cognitive Impairments, Autism and Orthopedic Impairments. Journal of Applied Testing Technology, 10(2), 1–33. Retrieved from http://www.jattjournal.net/index.php/atp/article/view/48351


