Title: Improving assessment evidence in e-learning products: some solutions for reliability

Authors: Kathleen Scalise, Tara Madhyastha, Jim Minstrell, Mark Wilson

Addresses: Department of Educational Methodology, Policy and Leadership, University of Oregon, Eugene, OR 97403, USA; Facet Innovations, LLC, 1314 NE 43rd St., Suite 207, Seattle, WA 98105, USA; Facet Innovations, LLC, 1314 NE 43rd St., Suite 207, Seattle, WA 98105, USA; University of California, Berkeley, CA 94720, USA

Abstract: E-learning products such as cognitive diagnosers interact with learners and collect assessment data to build a picture of some aspect of a learner's thinking. One concern for this rapidly emerging area of e-learning is whether the diagnostic conclusions of such products are based on sound evidence, including whether the diagnostics are reliable. In online settings, the information may be used for adaptive delivery of content, individualising learning materials, dynamic feedback, teacher feed-forward, cognitive mapping, score reporting and course placement. A reliability index quantifies the impact that measurement error at the individual level may have on the accuracy of the inference. This paper investigates some simple solutions that substantially improve reliability within one e-learning product. These solutions include providing questions of appropriate difficulty that help to maximise item information across the distribution.
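As an illustrative sketch only (not drawn from the article, which does not specify its measurement model in the abstract): under a Rasch model, the Fisher information an item contributes is P(theta)(1 - P(theta)), which peaks when item difficulty matches the learner's ability. The snippet below, with hypothetical ability and difficulty values and function names of my own choosing, shows why targeting question difficulty to the learner increases the information collected and thus supports higher reliability.

import math

def rasch_prob(theta, b):
    """Probability of a correct response under the Rasch model (assumed here)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def item_information(theta, b):
    """Fisher information of a single Rasch item at ability theta."""
    p = rasch_prob(theta, b)
    return p * (1.0 - p)

# Hypothetical learner ability and two candidate item difficulties.
theta = 0.5
well_targeted = 0.5    # difficulty matched to the learner's ability
poorly_targeted = 2.5  # item far too hard for this learner

print(item_information(theta, well_targeted))    # 0.25, the maximum possible per item
print(item_information(theta, poorly_targeted))  # ~0.105, much less information

Summed over the items a learner sees, this information determines the standard error of the ability estimate, which is why the paper's strategy of supplying questions of appropriate difficulty raises the reliability of the diagnosis.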

Keywords: student assessment; e-learning; cognitive diagnosis; reliability; adaptive delivery; personalised learning; online learning; feedback; measurement; diagnostics; physics education; electronic learning.

DOI: 10.1504/IJLT.2010.034549

International Journal of Learning Technology, 2010 Vol.5 No.2, pp.191 - 208

Published online: 07 Aug 2010
