Title: Detecting gender-biased items in a high-stakes language proficiency test: using Rasch model measurement

Authors: Soodeh Bordbar; Seyyed Mohammad Alavi

Addresses: Iran University of Medical Sciences, Hemmat Highway, Tehran, Iran; University of Tehran, Kargar Shomali St., Tehran, Iran

Abstract: The present study explores the validity of a high-stakes university entrance exam and considers the role of gender as a source of bias in different subtests of this language proficiency test. To this end, the Rasch model was used to inspect biased items and to examine construct-irrelevant factors. A differential item functioning (DIF) analysis under the Rasch model was conducted on 5,000 participants selected randomly from the pool of examinees who took the National University Entrance Exam in Iran for Foreign Languages (NUEEFL), a university entrance requirement for English language studies, in 2015. The findings reveal that the test scores are not free from construct-irrelevant variance, and some misfitting items were modified based on the fit statistics. In sum, the fairness of the NUEEFL was not confirmed. The results of such psychometric assessment could be beneficial for test designers, stakeholders, administrators, as well as teachers.

Keywords: bias; differential item functioning analysis; dimensionality; fairness; Rasch model.

DOI: 10.1504/IJQRE.2021.119817

International Journal of Quantitative Research in Education, 2021 Vol.5 No.3, pp.277 - 310

Accepted: 03 Jun 2021
Published online: 21 Dec 2021
