Examining the Validity and Fairness of a State Standards-based Assessment of English-language Arts for Deaf or Hard of Hearing Students

Authors

  • J. Steinberg, Educational Testing Service, Princeton, NJ
  • F. Cline, Educational Testing Service, Princeton, NJ
  • G. Ling, Educational Testing Service, Princeton, NJ
  • L. Cook, Educational Testing Service, Princeton, NJ
  • N. Tognatta, Educational Testing Service, Princeton, NJ

Abstract

This study examines the appropriateness of a large-scale state standards-based English-Language Arts (ELA) assessment for students who are deaf or hard of hearing by comparing the internal test structure for these students with that of students without disabilities. The Grade 4 and Grade 8 ELA assessments were analyzed through a series of parcel-level exploratory and confirmatory factor analyses, with both groups further split by English language learner (ELL) status. Differential item functioning (DIF) analyses were also conducted for these groups of students, and where sample sizes were sufficient, the groups were additionally split by test accommodation status. Results showed similar factor structures across the groups studied and minimal DIF, which can be interpreted as supporting the aggregation of scores from students who are deaf or hard of hearing for Adequate Yearly Progress (AYP) reporting.
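The DIF analysis named in the abstract is commonly carried out with the Mantel-Haenszel procedure: examinees are stratified on total test score, a 2×2 (group × correct/incorrect) table is accumulated within each stratum, and a common odds ratio is estimated across strata. The sketch below is illustrative only and is not the authors' code; the function name, the `"ref"`/`"foc"` group labels, and the flat-list data layout are assumptions for the example.

```python
import math
from collections import defaultdict

def mantel_haenszel_dif(item_correct, group, total_score):
    """Mantel-Haenszel common odds ratio for a single item.

    Examinees are stratified on total test score; within each
    stratum a 2x2 table (reference/focal group x correct/incorrect)
    is accumulated. Returns (alpha_MH, delta_MH), where
    delta_MH = -2.35 * ln(alpha_MH) is the ETS delta metric
    (negative delta indicates the item favors the reference group).
    """
    # Per-stratum cell counts: A = ref correct, B = ref incorrect,
    # C = focal correct, D = focal incorrect.
    strata = defaultdict(lambda: [0, 0, 0, 0])
    for y, g, s in zip(item_correct, group, total_score):
        cell = strata[s]
        if g == "ref":
            cell[0 if y else 1] += 1
        else:
            cell[2 if y else 3] += 1

    num = den = 0.0
    for A, B, C, D in strata.values():
        T = A + B + C + D
        if T == 0:
            continue
        num += A * D / T   # concordant contribution
        den += B * C / T   # discordant contribution
    alpha = num / den
    return alpha, -2.35 * math.log(alpha)

# Toy usage: one score stratum, reference group answers the item
# correctly more often than the focal group at the same total score.
item = [1] * 30 + [0] * 10 + [1] * 20 + [0] * 20
grp = ["ref"] * 40 + ["foc"] * 40
score = [10] * 80
alpha, delta = mantel_haenszel_dif(item, grp, score)
```

In operational DIF screening, items are then flagged (e.g., ETS categories A/B/C) based on the magnitude and statistical significance of delta, so "minimal DIF" in the abstract corresponds to few or no items in the most severe category.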


Published

2014-04-09

How to Cite

Steinberg, J., Cline, F., Ling, G., Cook, L., & Tognatta, N. (2014). Examining the Validity and Fairness of a State Standards-based Assessment of English-language Arts for Deaf or Hard of Hearing Students. Journal of Applied Testing Technology, 10(2), 1–33. Retrieved from http://www.jattjournal.net/index.php/atp/article/view/48356

Section

Articles

