The Retrofit of an English Language Placement Test Used for Large-scale Assessments in Higher Education

Document Type: Original Research Article


National Autonomous University of Mexico


Language placement tests (LPTs) are used to assess students’ proficiency in the target language; based on their performance, students are assigned to progressively leveled language courses. These tests are usually considered low-stakes because they do not have significant consequences for students’ lives, which is perhaps why studies on LPTs are scarce. Nevertheless, tests should be examined regularly, and statistical analyses should be conducted to assess their functioning, particularly when they have a medium- or high-stakes impact. In the case of LPTs administered on a large scale, the logistical and administrative consequences of an ill-defined test may impose an economic burden and an unnecessary use of human resources, which can also affect students negatively. This study was undertaken at one of the largest public institutions in Latin America, where nearly 1,700 students sit an English LPT every academic semester. A diagnostic statistical analysis revealed a need for revision. To retrofit the test, a new test architecture and blueprints were designed in adherence to the new curriculum, and new items were developed and tried out gradually in several pilot studies. Item Response Theory (IRT) was used to examine the functioning of the new test items. The aim of this study is to show how the test was retrofitted and to compare the functioning of the retrofitted version of the English LPT with the previous one. The results show that the quality of the new items was higher than that of the items in the former English LPT.
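The abstract does not specify which IRT model was used to examine item functioning; a minimal sketch of the two-parameter logistic (2PL) model, a common choice for dichotomously scored placement-test items, illustrates how item quality is typically quantified (the parameter values below are purely illustrative, not from the study):

```python
import math

def irt_2pl(theta, a, b):
    """Probability of a correct response under the 2PL IRT model.

    theta -- examinee ability on the latent scale
    a     -- item discrimination (how sharply the item separates abilities)
    b     -- item difficulty (ability level at which P = 0.5)
    """
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# A well-functioning item discriminates sharply around its difficulty...
good_item = [round(irt_2pl(t, a=1.8, b=0.0), 2) for t in (-2, -1, 0, 1, 2)]
# ...while a weakly discriminating item is nearly flat across ability levels,
# which is the kind of flaw a diagnostic IRT analysis would flag.
weak_item = [round(irt_2pl(t, a=0.2, b=0.0), 2) for t in (-2, -1, 0, 1, 2)]

print("good item:", good_item)
print("weak item:", weak_item)
```

Comparing such item characteristic curves before and after a retrofit is one way to show that the new items discriminate better among test takers near the placement cut points.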