Fairness Principle in Accreditation of Health Specialists: The Differential Item Functioning Method
I.M. Sechenov First Moscow Medical University (Sechenov University), Moscow, RUSSIA
Ministry of Health, Moscow, RUSSIA
Institute for Strategy of Education Development of the Russian Academy of Education, Moscow, RUSSIA
Financial University under the Government of the Russian Federation, Moscow, RUSSIA
Kazan (Volga region) Federal University, Kazan, RUSSIA
Online publish date: 2019-04-12
Publish date: 2019-04-12
EURASIA J. Math., Sci. Tech. Ed. 2019;15(9):em1749
The main purpose of this article is to present a Differential Item Functioning (DIF) method of item analysis designed to minimize the discriminatory effect of individual items in the accreditation of medical university graduates who completed different training programs. The one-parameter Item Response Theory model is used to ensure that graduates are treated equally in accreditation. The difference in the location of the item characteristic curves constructed for different graduate samples is measured using the item difficulty estimates. To develop the methodology, a number of research questions are posed, the solution of which made it possible to analyze item bias for the “Polyclinic” discipline over the item difficulty range from 1.5 to 2.5 logits. The item bank was cleaned, and an interpretation of decisions on item release and revision is presented for the cases in which the characteristic curves are arranged differently.
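The difficulty-contrast idea described in the abstract can be illustrated with a short sketch. This is not the authors' implementation: the function names (`item_difficulties`, `dif_contrast`) are hypothetical, and a crude log-odds (PROX-style) approximation stands in for full one-parameter IRT estimation. Each group's item difficulties are estimated on its own sample, centred so the two logit scales are comparable, and the per-item difference serves as the DIF indicator.

```python
import math

def item_difficulties(responses):
    """Crude Rasch-style difficulty estimates in logits from a 0/1
    response matrix (rows = persons, columns = items):
    b_i = ln(wrong_i / right_i), then centred at zero.
    Assumes every item has at least one right and one wrong answer."""
    n_items = len(responses[0])
    b = []
    for i in range(n_items):
        right = sum(row[i] for row in responses)
        wrong = len(responses) - right
        b.append(math.log(wrong / right))
    mean_b = sum(b) / n_items
    # Centring puts both groups on a common scale origin,
    # so their difficulty estimates can be compared directly.
    return [bi - mean_b for bi in b]

def dif_contrast(group_a, group_b):
    """Per-item shift (in logits) between the characteristic-curve
    locations of two samples. A large absolute contrast flags a
    potentially biased item for review, revision, or release."""
    ba = item_difficulties(group_a)
    bb = item_difficulties(group_b)
    return [round(x - y, 3) for x, y in zip(ba, bb)]
```

A contrast near zero means the item characteristic curves for the two samples coincide; a large shift in either direction indicates the item functions differently for one group and should be inspected before it is retained in the item bank.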