Embedded Standard Setting for Credentialing

Authors

  • D. Lewis, Creative Measurement Solutions, LLC
  • M. Graw, National Restaurant Association, LLC
  • M. Baker, National Restaurant Association, LLC

Keywords:

Standard Setting, Principled Assessment Design, Cut Scores, Credentialing

Abstract

Embedded Standard Setting (ESS; Lewis & Cook, 2020) transforms standard setting from a standalone workshop into an active part of the assessment development lifecycle. ESS purports to lower costs by eliminating the standard-setting workshop and to enhance the validity argument by maintaining a consistent focus on the evidentiary relationship between test items and their measurement targets throughout test development. ESS is currently employed by multiple K-12 state educational consortia and test publishers to support the coherence of their testing programs and fulfill their standard-setting requirements. A fundamental requirement of ESS is the development of Performance Level Descriptors (PLDs) that explicate the construct across the levels of performance. Formal PLDs are not common in the credentialing industry, but the commonly developed Job Task Analysis should be sufficient to support ESS requirements. The current study investigated whether ESS could be adapted for the credentialing industry. The authors describe the methods and results of an ESS pilot study and discuss how credentialing test development practices could be modestly modified to support the application of ESS.
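
To make the item-to-level alignment idea concrete, the sketch below shows one way a count-based cut-score search could work, loosely in the spirit of the ESS-Count algorithm studied by Lewis, Lee, and Choi (2021): items are aligned to performance levels during development, and a cut score is chosen to minimize the number of items whose empirical difficulty conflicts with their alignment. The Item structure, the estimate_cut helper, and the toy data are illustrative assumptions for this sketch, not the authors' published algorithm or the EmStanS software.

```python
# Hedged sketch of a count-based cut-score search in the spirit of ESS-Count
# (Lewis, Lee, & Choi, 2021). All names and data below are illustrative
# assumptions, not the authors' code or the EmStanS implementation.

from dataclasses import dataclass
from typing import List


@dataclass
class Item:
    difficulty: float  # Rasch difficulty estimate, in logits
    level: int         # performance level the item was aligned to (1 = lowest)


def candidate_cuts(items: List[Item]) -> List[float]:
    """Midpoints between adjacent unique difficulties serve as candidate cut scores."""
    d = sorted({it.difficulty for it in items})
    return [(a + b) / 2 for a, b in zip(d, d[1:])]


def inconsistency_count(items: List[Item], cut: float, boundary_level: int) -> int:
    """Count items whose difficulty falls on the wrong side of the cut for their level."""
    errors = 0
    for it in items:
        if it.level < boundary_level and it.difficulty >= cut:
            errors += 1  # aligned below the boundary but empirically above the cut
        elif it.level >= boundary_level and it.difficulty < cut:
            errors += 1  # aligned at/above the boundary but empirically below the cut
    return errors


def estimate_cut(items: List[Item], boundary_level: int) -> float:
    """Return the candidate cut that minimizes the count of misaligned items."""
    return min(candidate_cuts(items),
               key=lambda c: inconsistency_count(items, c, boundary_level))


if __name__ == "__main__":
    # Toy item pool: difficulties (logits) with expert level alignments.
    pool = [Item(-1.2, 1), Item(-0.4, 1), Item(0.1, 2), Item(0.6, 2),
            Item(1.3, 3), Item(1.8, 3), Item(0.3, 1)]  # last item is "inconsistent"
    print("Level-2 cut:", estimate_cut(pool, boundary_level=2))
    print("Level-3 cut:", estimate_cut(pool, boundary_level=3))
```

A weighted variant (in the spirit of ESS-Weight) could penalize each inconsistency by its distance from the cut rather than counting it once; the paper and the cited symposium describe the actual estimation procedures.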

Published

2024-03-07

How to Cite

Lewis, D., Graw, M., & Baker, M. (2024). Embedded Standard Setting for Credentialing. Journal of Applied Testing Technology. Retrieved from http://www.jattjournal.net/index.php/atp/article/view/173201

Section

Articles

References

Ferrara, S., & Lewis, D. M. (2012). The item descriptor matching standard setting procedure. In G. J. Cizek (Ed.), Setting performance standards: Concepts, methods, and perspectives (2nd ed., pp. 255–282). New York, NY: Routledge.

Impara, J. C., & Plake, B. S. (1997). Standard setting: An alternative approach. Journal of Educational Measurement, 23, 35–56. https://doi.org/10.1111/j.1745-3984.1997.tb00523.x

Lewis, D. (2023). A bottom-up approach to instruction and assessment: Supporting validity with embedded standard setting. Paper presented at the 2023 Annual Meeting of the National Council on Measurement in Education, New Orleans.

Lewis, D., & Cook, R. (2023). Automating the estimation of cut scores via ESS. Paper presented at the 2023 Annual Meeting of the National Council on Measurement in Education, New Orleans.

Lewis, D., & Cook, R. (2020). Embedded standard setting: Aligning standard setting methodology with contemporary assessment design principles. Educational Measurement: Issues and Practice, 39(1), 8–21. https://doi.org/10.1111/emip.12318

Lewis, D., & Lee, S. (2021). EmStanS® Embedded Standard Setting software. Seattle, WA: Creative Measurement Solutions LLC.

Lewis, D., Lee, S., & Choi, S. (2021). An investigation of two Embedded Standard Setting cut score estimation algorithms: ESS-Count and ESS-Weight. In D. Lewis (Organizer), Embedded Standard Setting: Research and Advances. Symposium at the 2021 annual meeting of the National Council on Measurement in Education, virtual.

Lewis, D. M., Mitzel, H. C., & Green, D. R. (1996). Standard setting: A bookmark approach. In D. R. Green (Chair), IRT-based standard-setting procedures utilizing behavioral anchoring. Paper presented at the 1996 Council of Chief State School Officers National Conference on Large Scale Assessment, Phoenix.

Lewis, D. M., Mitzel, H. C., Mercado, R., & Schulz, M. (2012). The bookmark standard setting procedure. In G. J. Cizek (Ed.), Setting performance standards: Concepts, methods, and perspectives (2nd ed., pp. 225–254). New York, NY: Routledge.

National Commission for Certifying Agencies. (2021). Standards for the accreditation of certification programs. Institute for Credentialing Excellence, Washington, DC.

Rasch, G. (1980). Probabilistic models for some intelligence and attainment tests. Chicago, IL: University of Chicago Press. (Original work published 1960).
