Title: LMS assessment: using IRT analysis to detect defective multiple-choice test items

Authors: Panagiotis Fotaris; Theodoros Mastoras

Addresses: School of Arts and Digital Industries, University of East London, Docklands Campus, University Way, London E16 2RD, UK; Department of Applied Informatics, University of Macedonia, 156 Egnatia Str., 54006 Thessaloniki, Greece

Abstract: Due to the computerisation of assessment tests, the use of Item Response Theory (IRT) has become commonplace in educational assessment development, evaluation and refinement. When used appropriately by a Learning Management System (LMS), IRT can improve assessment quality, increase the efficiency of the testing process and provide in-depth descriptions of item properties. This paper introduces a methodological and architectural framework that embeds an IRT analysis tool in an LMS so as to extend its functionality with assessment optimisation support. By applying a set of validity rules to the statistical indices produced by the IRT analysis, the enhanced LMS is able to detect defective items in an item pool, which are then reported for content review. Assessment refinement is achieved by repeating this process until all flawed items are eliminated.
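To illustrate the screening step the abstract describes, the following Python sketch applies validity rules to the three-parameter logistic (3PL) item indices: discrimination (a), difficulty (b) and pseudo-guessing (c). The thresholds shown are common rules of thumb assumed for this example, not the specific cut-offs defined in the paper, and the names ItemStats, flag_defective and review_report are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class ItemStats:
        """IRT (3PL) indices for one multiple-choice item."""
        item_id: str
        discrimination: float  # a-parameter
        difficulty: float      # b-parameter
        guessing: float        # c-parameter (pseudo-chance level)

    def flag_defective(item: ItemStats) -> list[str]:
        """Apply illustrative validity rules; return the reasons an item fails."""
        reasons = []
        if item.discrimination < 0.5:
            reasons.append("low discrimination (a < 0.5)")
        if not -3.0 <= item.difficulty <= 3.0:
            reasons.append("extreme difficulty (|b| > 3)")
        if item.guessing > 0.35:
            reasons.append("high pseudo-guessing (c > 0.35)")
        return reasons

    def review_report(pool: list[ItemStats]) -> dict[str, list[str]]:
        """Return the items flagged for content review, keyed by item id."""
        return {it.item_id: r for it in pool if (r := flag_defective(it))}

    if __name__ == "__main__":
        pool = [
            ItemStats("Q1", discrimination=1.2, difficulty=0.4, guessing=0.18),
            ItemStats("Q2", discrimination=0.2, difficulty=3.6, guessing=0.40),
        ]
        for item_id, reasons in review_report(pool).items():
            print(item_id, "->", "; ".join(reasons))  # Q2 is flagged on all three rules

In the iterative refinement loop the paper outlines, a report like this would be returned to the item authors, the flagged items revised or removed, and the analysis rerun on the updated pool until no items are flagged.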

Keywords: e-learning; item pool optimisation; IRT; item response theory; computer-aided assessment; learning management systems; LMS assessment; technology enhanced learning; MOOCs; massive open online courses; multiple choice tests; defective test items; electronic learning; online learning.

DOI: 10.1504/IJTEL.2014.069015

International Journal of Technology Enhanced Learning, 2014, Vol. 6, No. 4, pp. 281-296

Received: 05 Mar 2014
Accepted: 08 Oct 2014

Published online: 24 Apr 2015
