Worth the Squeeze? An Investigation into the Psychometric Performance of Alternative Item Types

Authors

  • A. A. Wolkowitz, Senior Psychometrician, Alpine Testing Solutions, Inc., 51 W. Center Street, #514, Orem, UT 84057
  • B. P. Foley, Director of Professional Credentialing and Senior Psychometrician, Alpine Testing Solutions, Inc., 51 W. Center Street, #514, Orem, UT 84057
  • J. Zurn, Vice President, Examination, National Council of Architectural Registration Boards, 1401 H St NW #500, Washington, DC 20005, United States

Keywords:

Alternative Item Types, Innovative Item Types, Item Development, Item Analysis, Credentialing

Abstract

As assessments move from traditional paper-pencil administration to computer-based administration, many testing programs are incorporating alternative item types (AITs) into assessments with the goals of measuring higher-order thinking, offering insight into problem-solving, and representing authentic real-world tasks. This paper explores multiple applications of AIT items and the psychometric properties of these items, including item response time, difficulty, item-total score correlations, and distractor analyses. The appropriate use of these items is also discussed in the context of professional credentialing exams.


Published

2021-03-04

How to Cite

Wolkowitz, A. A., Foley, B. P., & Zurn, J. (2021). Worth the Squeeze? An Investigation into the Psychometric Performance of Alternative Item Types. Journal of Applied Testing Technology, 22(1), 37–51. Retrieved from http://www.jattjournal.net/index.php/atp/article/view/155940

Section

Articles

References

Dolan, R. P., Goodman, J., Strain-Seymore, E., Adams, J., & Sethuraman, S. (2011). Cognitive lab evaluation of innovative items in mathematics and English language arts assessment of elementary, middle, and high school students. [Research Report]. San Antonio, TX: Pearson.

Downing, S. M., & Haladyna, T. M. (Eds.). (2006). Handbook of test development. Mahwah, NJ: Lawrence Erlbaum Associates Publishers.

Frederiksen, N., & Satter, G. A. (1953). The construction and validation of an arithmetical computation test. Educational and Psychological Measurement, 13(2), 209–227. https://doi.org/10.1177/001316445301300206

Haladyna, T. M., & Rodriguez, M. C. (2013). Developing and validating test items. New York, NY: Routledge. https://doi.org/10.4324/9780203850381

Hollingworth, L., Beard, J. J., & Proctor, T. P. (2007). An investigation of item type in a standards-based assessment. Practical Assessment, Research & Evaluation, 12(18). Retrieved from https://pareonline.net/getvn.asp?v=12&n=18

Institute for Credentialing Excellence. (2017). Innovative Item Types. Washington, DC: Author.

Jodoin, M. G. (2003). Measurement efficiency of innovative item formats in computer-based testing. Journal of Educational Measurement, 40(1), 1–15. https://doi.org/10.1111/j.1745-3984.2003.tb01093.x

Lukhele, R., Thissen, D., & Wainer, H. (1994). On the relative value of multiple-choice, constructed response, and examinee-selected items on two achievement tests. Journal of Educational Measurement, 31, 234–250. https://doi.org/10.1111/j.1745-3984.1994.tb00445.x

Parshall, C. G., & Harmes, J. C. (2008). The design of innovative item types: Targeting constructs, selecting innovations, and refining prototypes. CLEAR Exam Review, 19(2).

Rimland, B., & Zwerski, E. (1962). The use of open-end data as an aid in writing multiple-choice distracters: An evaluation with arithmetic reasoning and computation items. Journal of Applied Psychology, 46(1), 31–33. https://doi.org/10.1037/h0048193

Sireci, S. G., & Zenisky, A. L. (2006). Innovative item formats in computer-based testing: In pursuit of improved construct representation. In S. M. Downing & T. M. Haladyna (Eds.), Handbook of test development. Mahwah, NJ: Lawrence Erlbaum Associates.

Wendt, A., & Harmes, J. C. (2009). Evaluating innovative items for the NCLEX, Part I: Usability and Pilot Testing. Nurse Educator, 34(2), 56–59. https://doi.org/10.1097/NNE.0b013e3181990849