Document Type : Special Issue
Non-profit Joint Stock Company Semey Medical University, Department of Hospital Surgery, 071400-Semey City, Abai Street, 103, Republic of Kazakhstan.
College of Education, Al-Farahidi University, Baghdad, Iraq
College of Education, The Islamic University in Najaf, Iraq
English Department, Ahl-Al-Bayt University, Kerbala, Iraq
Department of Medical Laboratory Techniques, Al-Zahrawi University College, Karbala, Iraq
Al-Nisour University College, Baghdad, Iraq
Institute of Natural Sciences and Geography, Abai Kazakh National Pedagogical University, Almaty, Kazakhstan
People’s Friendship University of Russia, Moscow, Russia
Candidate of Biological Sciences, Docent, Institute of Natural Sciences and Geography, Abai Kazakh National Pedagogical University, Almaty, Kazakhstan.
The multiple-choice (MC) item format is commonly used in educational assessments because of its economy and effectiveness across a variety of content domains. However, numerous studies examining the quality of MC items in high-stakes and higher education assessments have found many flawed items, especially with respect to distractors. Such faulty items lead to misleading insights about student performance and about the final decisions based on it. Distractor analysis is therefore typically conducted in educational assessments with MC items to ensure that high-quality items serve as the basis of inference. Item response theory (IRT) and Rasch models, however, have received little attention as tools for analyzing distractors. For that reason, the purpose of the present study was to apply the Rasch model to a grammar test in order to analyze the test items' distractors. To this end, the study investigated the quality of 10 instructor-written MC grammar items used in an undergraduate final exam, drawing on the item responses of 310 English as a foreign language (EFL) students who had completed an advanced grammar course. The results showed acceptable fit to the Rasch model and high reliability, and malfunctioning distractors were identified.
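The distractor-flagging logic the abstract alludes to can be sketched in classical form: split examinees into low- and high-ability groups by total score and flag any distractor that high scorers choose more often than low scorers. The sketch below is purely illustrative and uses simulated data with assumed dimensions (310 examinees, 10 four-option items); it is not the study's actual Rasch analysis or dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical response matrix: 310 examinees x 10 MC items, each cell
# holding the chosen option 'A'-'D'. Sizes mirror the study; the data
# themselves are simulated, not the study's real responses.
n_persons, n_items = 310, 10
options = np.array(list("ABCD"))
key = rng.choice(options, size=n_items)              # simulated answer key
responses = rng.choice(options, size=(n_persons, n_items))

# Score responses and split examinees at the median total score.
scored = (responses == key).astype(int)              # 1 = correct
total = scored.sum(axis=1)
low = total <= np.median(total)
high = ~low

def option_props(group_mask, item):
    """Proportion of a group choosing each option on one item."""
    picks = responses[group_mask, item]
    return {o: float(np.mean(picks == o)) for o in options}

# Flag a distractor when high scorers pick it more often than low
# scorers -- the negative-discrimination pattern associated with
# malfunctioning distractors.
flagged = []
for i in range(n_items):
    p_low, p_high = option_props(low, i), option_props(high, i)
    for o in options:
        if o != key[i] and p_high[o] > p_low[o]:
            flagged.append((i, str(o)))

print(f"{len(flagged)} distractors flagged for review")
```

In the study itself, this classical screen would be complemented by Rasch-based diagnostics (item fit statistics and option characteristic curves) rather than used on its own.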