Document Type: Research Paper

Authors

Abstract

Background: The invariance of item and test parameters across groups is an important issue in assessment.
Objectives: The present study compared the invariance of item parameters under item response theory (IRT) and confirmatory factor analysis (CFA).
Methods: After reviewing the relevant foundations of each approach, the researcher compared parameter invariance under each approach using empirical data from the Progress in International Reading Literacy Study (PIRLS). The sample comprised 5,000 Iranian students (half female, half male) from the 2006 cycle, who responded to six items on the attitude-toward-reading scale.
Results: Item 6 was flagged as biased under both IRT and CFA. The two approaches disagreed, however, on items 1, 3, and 4: item 1 was flagged as biased only under IRT, whereas items 3 and 4 were flagged as biased only under CFA.
Conclusion: It is suggested that both approaches be employed when deciding on parameter invariance, since relying on either one alone may be misleading. It is also recommended that item intercepts, and differences between the groups' ability distributions and their effects on invariance, be examined first.
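The study's analyses rely on specialized software (e.g., the DFIT framework and LISREL, per the references), but the core idea of checking a single item for bias between two groups of comparable ability can be illustrated with a simpler, classical procedure. The following Python sketch uses the Mantel-Haenszel method on simulated data; the simulation setup and all names are illustrative assumptions, not taken from the study. Examinees are stratified by their rest score, and a common odds ratio far from 1 signals differential item functioning.

```python
import numpy as np

def mantel_haenszel_or(responses, group, item):
    """Mantel-Haenszel common odds ratio for one item.

    responses : (n_persons, n_items) array of 0/1 scores
    group     : 0 = reference group, 1 = focal group
    Examinees are stratified by their rest score (total score
    excluding the studied item); a ratio far from 1 suggests DIF.
    """
    rest = responses.sum(axis=1) - responses[:, item]
    num = den = 0.0
    for k in np.unique(rest):
        s = rest == k
        a = np.sum((group[s] == 0) & (responses[s, item] == 1))  # reference, correct
        b = np.sum((group[s] == 0) & (responses[s, item] == 0))  # reference, incorrect
        c = np.sum((group[s] == 1) & (responses[s, item] == 1))  # focal, correct
        d = np.sum((group[s] == 1) & (responses[s, item] == 0))  # focal, incorrect
        t = a + b + c + d
        if t > 0:
            num += a * d / t
            den += b * c / t
    return num / den

# Simulated demo (illustrative only): 5,000 examinees, six Rasch items,
# with item 6 deliberately made one logit harder for the focal group.
rng = np.random.default_rng(0)
n, n_items = 5000, 6
group = rng.integers(0, 2, n)
theta = rng.normal(0.0, 1.0, n)
difficulty = np.linspace(-1.0, 1.0, n_items)
p = 1.0 / (1.0 + np.exp(-(theta[:, None] - difficulty[None, :])))
p[:, 5] = np.where(group == 1,
                   1.0 / (1.0 + np.exp(-(theta - 2.0))),  # focal: harder item 6
                   p[:, 5])
responses = (rng.random((n, n_items)) < p).astype(int)

or_clean = mantel_haenszel_or(responses, group, 0)  # unbiased item: ratio near 1
or_dif = mantel_haenszel_or(responses, group, 5)    # biased item: ratio well above 1
print(f"item 1 MH odds ratio: {or_clean:.2f}; item 6 MH odds ratio: {or_dif:.2f}")
```

This is only a sketch of the DIF idea, not the DFIT or multi-group CFA machinery the study actually used; those methods additionally test significance and model item parameters directly.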



Keywords

Persian References
Izanloo, B., & Habibi, M. (2007/1386 SH). An introduction to the foundations of new measurement approaches in psychology and the educational sciences. Quarterly Journal of Psychology and Educational Sciences, 8, 135-165. [In Persian]
 
 
English References
Ackerman, T. A. (1992). A didactic explanation of item bias, item impact, and item validity from a multidimensional perspective. Journal of Educational Measurement, 29(1), 67-91.
Baker, F. (1995). EQUATE 2.1: Computer program for equating two metrics in item response theory [Computer program]. Madison: University of Wisconsin, Laboratory of Experimental Design.
Baker, F. B. (2004). Item response theory: Parameter estimation techniques (Vol. 176): CRC Press.
Braun, H. I. (1988). Test validity: Lawrence Erlbaum.
Byrne, B. M. (1998). Structural equation modeling with LISREL, PRELIS, and SIMPLIS: Basic concepts, applications, and programming: Lawrence Erlbaum.
Drasgow, F., & Kanfer, R. (1985). Equivalence of psychological measurement in heterogeneous populations. Journal of Applied Psychology, 70(4), 662.
Du Toit, M. (2003). IRT from SSI: BILOG-MG, MULTILOG, PARSCALE, TESTFACT: Scientific Software International.
Embretson, S. E., & Reise, S. P. (2000). Item response theory for psychologists (Vol. 4): Lawrence Erlbaum.
Flowers, C. P., Raju, N. S., & Oshima, T. (2002). A Comparison of Measurement Equivalence Methods Based on Confirmatory Factor Analysis and Item Response Theory.
Hambleton, R. K. (1997). Handbook of modern item response theory: Springer-Verlag.
Hambleton, R. K., Swaminathan, H., & Rogers, H. J. (1991). Fundamentals of item response theory (Vol. 2): Sage Publications, Inc.
van der Linden, W. J., & Hambleton, R. K. (1997). Handbook of modern item response theory. New York: Springer.
Morales, L. S., Flowers, C., Gutierrez, P., Kleinman, M., & Teresi, J. A. (2006). Item and scale differential functioning of the Mini-Mental State Exam assessed using the differential item and test functioning (DFIT) framework. Medical Care, 44(11 Suppl 3), S143.
Morris, B. S. (2009). Polycov [Computer program]. Chicago: Illinois Institute of Technology.
Mullis, I. V. S., Martin, M. O., Kennedy, A. M., & Foy, P. (2007). PIRLS 2006 international report. Boston: IEA (http://pirls.bc.edu/isc/publications.html#p06, retrieved 30.6.2008).
Oshima, T., Kushubar, S., Scott, J., & Raju, N. (2009). DFIT8 for Windows user's manual: Differential functioning of items and tests. St. Paul, MN: Assessment Systems Corporation.
Oshima, T., Raju, N. S., & Nanda, A. O. (2006). A new method for assessing the statistical significance in the differential functioning of items and tests (DFIT) framework. Journal of Educational Measurement, 43(1), 1-17.
Raju, N. S., Laffitte, L. J., & Byrne, B. M. (2002). Measurement equivalence: A comparison of methods based on confirmatory factor analysis and item response theory. Journal of Applied Psychology, 87(3), 517.
Reise, S. P., Widaman, K. F., & Pugh, R. H. (1993). Confirmatory factor analysis and item response theory: Two approaches for exploring measurement invariance. Psychological Bulletin, 114(3), 552.
Raju, N. S., van der Linden, W. J., & Fleer, P. F. (1995). IRT-based internal measures of differential functioning of items and tests. Applied Psychological Measurement, 19(4), 353-368.
Schmitt, N., & Kuljanin, G. (2008). Measurement invariance: Review of practice and implications. Human Resource Management Review, 18(4), 210-222.
Sireci, S. G., & Allalouf, A. (2003). Appraising item equivalence across multiple languages and cultures. Language Testing, 20(2), 148.
Thissen, D. (1991). MULTILOG user’s guide. Chicago: Scientific Software.
Vandenberg, R. J., & Lance, C. E. (2000). A review and synthesis of the measurement invariance literature: Suggestions, practices, and recommendations for organizational research. Organizational Research Methods, 3(1), 4-70.