Title: Assessment analysis: methods and implementation options for multiple-choice exams

Authors: Annastiina Rintala

Addresses: School of Engineering Science, LUT University, Lappeenranta, FI-53850, Finland

Abstract: Automatic assessment can reduce teacher workload and offer flexibility for students, but if the teacher does not assess the exams manually, the teacher's view of the students' competence and exam-related behaviours will be meagre. This drawback can be mitigated through appropriate analytics. To support the design of an analytics module for an electronic assessment system, this paper investigates what kinds of analyses are useful for multiple-choice exams and how the analysis can be implemented. Three types of analysis were found useful: 1) descriptive statistics of exam answers; 2) analysis of errors in answers; and 3) analysis of students' exam-taking behaviours. Although these analyses are generalisable to some extent, analysis needs vary by, for example, time, exam type and user. Therefore, it is suggested that, to enable user-specific analyses in a resource-efficient manner, assessment software providers should facilitate access to assessment data in a structured format.

Keywords: multiple-choice exams; automatic assessment; electronic assessment; analytics tools; design science; electronic learning; learning analytics; assessment analytics; innovation; education.

DOI: 10.1504/IJIIE.2023.128460

International Journal of Innovation in Education, 2023 Vol.8 No.1, pp.20-39

Received: 11 Nov 2021
Accepted: 08 Jun 2022

Published online: 23 Jan 2023