National Taiwan Normal University
Bulletin of Educational Psychology
42(4), publication date: June 2011
The Effect of Model Misspecification on Computerized Testing
    Authors: Hung-Yi Lu, Yung-Feng Hsu, Kuo-Sung Hsueh
Research Article

 Item response theory (IRT) has been widely applied in computerized adaptive testing (CAT), with logistic models used most often. IRT prescribes an item characteristic curve that gives the probability of an examinee with a given ability level correctly answering an item with given item parameters. Based on a chosen item response model, examiners can develop various tests for different purposes. In actual testing practice, however, the true item response model is often unknown a priori. The purpose of this study is to examine the effect of model misspecification on computerized testing. In norm-referenced testing, results indicated that model misspecification affects the estimation of examinees' abilities: both the RMSE and the test length increased significantly when the wrong item response model was used, especially when the item bank was calibrated under the three-parameter logistic model. In criterion-referenced testing, model misspecification had no effect on classification accuracy; however, it increased test length and therefore the cost of testing.
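For background on the logistic models referred to above, a commonly used form of the item characteristic curve is the three-parameter logistic (3PL) model, with the two- and one-parameter models as special cases. This is stated here as general reference, not necessarily the exact parameterization used in the study:

$$
P_i(\theta) = c_i + (1 - c_i)\,\frac{1}{1 + \exp\!\left[-a_i(\theta - b_i)\right]}
$$

where $\theta$ is the examinee's ability, $a_i$ is the discrimination, $b_i$ the difficulty, and $c_i$ the pseudo-guessing parameter of item $i$. Setting $c_i = 0$ gives the two-parameter logistic model, and additionally constraining $a_i$ to a common value gives the one-parameter logistic model.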





Keywords: computerized adaptive testing, item response theory, model misspecification


