Computer-Based Testing: Score Equivalence and Testing Administration Mode Preference in a Comparative Evaluation Study

Authors

  • Hooshang Khoshsima, Associate Professor, Language Department, Faculty of Management and Humanities, Chabahar Maritime University, Chabahar, Iran
  • Seyyed Morteza Hashemi Toroujeni, M.A. in TEFL, Language Department, Faculty of Management and Humanities, Chabahar Maritime University, Chabahar, Iran

DOI:

https://doi.org/10.3991/ijet.v12i10.6875

Keywords:

Computer-Based Testing, Testing Mode Preference

Abstract


Empirical evidence shows that two otherwise identical tests administered as Computer-Based Testing (henceforth CBT) and Paper-and-Pencil-Based Testing (henceforth PBT) do not always yield the same scores. This phenomenon is referred to as the "testing administration mode effect" or simply the "testing mode effect". Moderators such as individual differences (e.g., prior computer experience or computer attitude) have been investigated [4] to determine whether they influence test takers' performance. The Guidelines for Computer-Based Tests and Interpretations [1] recommend eliminating the possible effects of such moderator variables on test takers' performance. This study was conducted to provide empirical evidence on whether changing the administration mode from conventional PBT to modern CBT has a distinct effect on scores. The relationship between testing mode preference and test takers' CBT performance was also examined. Two equivalent tests and two questionnaires were used. Based on descriptive statistics and ANOVA, the findings demonstrated that the CBT and PBT sets of scores were comparable. Additionally, prior testing mode preference and gender had no significant effect on test takers' CBT scores and were not considered variables that might affect CBT performance.
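
To illustrate the kind of comparison the abstract describes, the following is a minimal sketch of a one-way ANOVA on two sets of scores, written in Python with SciPy. It is not the authors' own analysis code, and the score values are invented placeholders, not data from the study.

# Hypothetical sketch: comparing CBT and PBT score sets with a one-way ANOVA.
# The score values below are invented placeholders, not data from the study.
from scipy import stats

cbt_scores = [72, 65, 80, 58, 77, 69, 74, 61, 83, 70]  # placeholder CBT scores
pbt_scores = [70, 67, 78, 60, 75, 71, 72, 63, 81, 68]  # placeholder PBT scores

# One-way ANOVA; with only two groups this is equivalent to an
# independent-samples t-test on the same data.
f_stat, p_value = stats.f_oneway(cbt_scores, pbt_scores)

print(f"F = {f_stat:.3f}, p = {p_value:.3f}")
# A p-value above the chosen alpha (e.g., 0.05) would indicate no significant
# difference between the two administration modes, i.e., comparable scores.

In a study of this kind, a non-significant result supports score equivalence across administration modes; additional factors such as gender or mode preference can then be tested as grouping variables in the same way.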

Published

2017-11-02

How to Cite

Khoshsima, H., & Hashemi Toroujeni, S. M. (2017). Computer-Based Testing: Score Equivalence and Testing Administration Mode Preference in a Comparative Evaluation Study. International Journal of Emerging Technologies in Learning (iJET), 12(10), pp. 35–55. https://doi.org/10.3991/ijet.v12i10.6875

Issue

Vol. 12 No. 10 (2017)

Section

Papers