Comparative Analysis of Question Item Parameters and Students' Ability between Dichotomy and Polytomic Score Versions; Research on Mathematics National Exam Test Participants

Authors

  • Reza Oktiana Akbar IAIN Syekh Nurjati Cirebon

DOI:

https://doi.org/10.58557/ijeh.v1i4.29

Keywords:

Student’s Ability, Dichotomous Scoring, Polytomous Scoring, MLE, EAP

Abstract

The purpose of this study was to determine the differences in the estimation of item parameters and students' abilities between the dichotomous and the polytomous scoring versions. The data are the responses of participants in a national mathematics exam from a particular year, presented in two scoring forms: dichotomous and polytomous. In the dichotomous version, each item is scored individually; for the polytomous version, the separately scored items belonging to the same indicator are grouped and their scores summed. Item parameters and students' abilities were estimated with the 3PL (three-parameter logistic) model using MLE for the dichotomous version, and with the GPCM (Generalized Partial Credit Model) using EAP for the polytomous version; both were analyzed with the PARSCALE software. The two scoring versions were compared on the average estimated difficulty level, graphical analysis, correlations, and the values of the information function. The analysis shows that the average difficulty level of the dichotomous version is 0.166 (standard deviation 1.137), while that of the polytomous version is 0.033 (standard deviation 0.940). The information function of the dichotomous scoring version is higher than that of the polytomous version. These results indicate that the mathematics exam test with the dichotomous scoring version performs better than the polytomous scoring version.
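The abstract pairs a 3PL model (estimated via MLE) for dichotomous scores with a GPCM (estimated via EAP) for polytomous scores, and compares them through their information functions. A minimal sketch of the item-level quantities involved is given below: the 3PL response probability and its Fisher information, and GPCM category probabilities. This is an illustration, not a reproduction of PARSCALE's estimation; the scaling constant D = 1.7 and the step-parameter sign convention (θ − b + d) are assumptions following Muraki's common parameterization.

```python
import math

def p_3pl(theta, a, b, c, D=1.7):
    """3PL probability of a correct response at ability theta.

    a = discrimination, b = difficulty, c = pseudo-guessing.
    """
    return c + (1 - c) / (1 + math.exp(-D * a * (theta - b)))

def info_3pl(theta, a, b, c, D=1.7):
    """Fisher (item) information of a 3PL item at ability theta."""
    p = p_3pl(theta, a, b, c, D)
    q = 1 - p
    return (D * a) ** 2 * (q / p) * ((p - c) / (1 - c)) ** 2

def gpcm_probs(theta, a, b, d, D=1.7):
    """GPCM category probabilities for one item.

    b = item location, d = list of step parameters (sign convention
    theta - b + d_v is an assumption; conventions differ by source).
    Returns probabilities for categories 0..len(d).
    """
    # Cumulative logits: the k = 0 term is 0, then each step adds
    # D * a * (theta - b + d_v).
    z = [0.0]
    for dv in d:
        z.append(z[-1] + D * a * (theta - b + dv))
    ez = [math.exp(v) for v in z]
    total = sum(ez)
    return [v / total for v in ez]
```

For example, at theta = b the 3PL probability reduces to c + (1 − c)/2, and the GPCM probabilities always sum to 1; summing `info_3pl` over all items at a grid of theta values yields the test information function compared in the study.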

References

Du Toit, M. (Ed.). (2003). IRT from SSI: BILOG-MG, MULTILOG, PARSCALE, TESTFACT. Scientific Software International.

Hambleton, R. K., & Swaminathan, H. (1985). A Look at Psychometrics in the Netherlands.

Hambleton, R. K., Swaminathan, H., & Rogers, H. J. (1991). Fundamentals of Item Response Theory. Sage Publications.

Muraki, E. (1997). A generalized partial credit model. In Handbook of modern item response theory (pp. 153-164). Springer, New York, NY.

Retnawati, H. (2015). Perbandingan Estimasi Kemampuan Laten Antara Metode Maksimum Likelihood Dan Metode Bayes [Comparison of latent ability estimation between the maximum likelihood method and the Bayes method]. Jurnal Penelitian Dan Evaluasi Pendidikan, 19(2), 145–155. https://doi.org/10.21831/pep.v19i2.5575

Retnawati, H. (2018). Mengestimasi Kemampuan Peserta Tes Uraian Matematika dengan Pendekatan Teori Respons Butir dengan Penskoran Politomi dengan Generalized Partial Credit Model [Estimating the ability of mathematics essay test takers with an item response theory approach using polytomous scoring with the Generalized Partial Credit Model].

Published

2021-12-28

How to Cite

Akbar, R. O. (2021). Comparative Analysis of Question Item Parameters and Students’ Ability between Dichotomy and Polytomic Score Versions; Research on Mathematics National Exam Test Participants. International Journal of Education and Humanities, 1(4), 171–180. https://doi.org/10.58557/ijeh.v1i4.29