Self-Adapted Testing: A Performance-Improving Variant of Computerized Adaptive Testing

Research output: Contribution to journal › Article › peer-review

Abstract

We conducted an experiment that contrasted a variant of computerized adaptive testing, self-adapted testing, with two traditional tests, a relatively difficult one and a relatively easy one, constructed from the same bank of verbal ability items. In a self-adapted test, the examinee, rather than a computerized algorithm, chooses the difficulty of the next item to be presented. Participants completed a self-report measure of test anxiety and were randomly assigned to take one of the three tests of verbal ability. Analyses of variance using Rasch estimates of ability and the standard errors of those estimates as dependent measures demonstrated that the self-adapted test led to higher ability estimates and minimized the effect of test anxiety without any overall loss of measurement precision. Analysis of the item choices in the self-adapted test suggested that, in general, participants chose more difficult items as the test progressed. Anxiety was negatively associated with the difficulty of the initial choice but not associated with the rate of progress to more difficult items.
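The abstract's dependent measures, Rasch ability estimates and their standard errors, can be illustrated with a short sketch. The code below is not the authors' analysis; it is a minimal maximum-likelihood Rasch estimator, assuming known item difficulties and a mixed (not all-correct, not all-incorrect) response pattern, with the standard error taken as the reciprocal square root of the test information.

```python
import math

def rasch_ability(responses, difficulties, iters=25):
    """Maximum-likelihood Rasch ability estimate and its standard error.

    responses:    list of 0/1 item scores
    difficulties: list of Rasch item difficulties (same length)
    """
    theta = 0.0  # start at the center of the ability scale
    info = 1.0
    for _ in range(iters):
        # P(correct) for each item under the Rasch model
        probs = [1.0 / (1.0 + math.exp(-(theta - b))) for b in difficulties]
        # Gradient of the log-likelihood and the test information
        grad = sum(x - p for x, p in zip(responses, probs))
        info = sum(p * (1.0 - p) for p in probs)
        theta += grad / info  # Newton-Raphson step
    return theta, 1.0 / math.sqrt(info)  # SE = 1 / sqrt(information)

# Hypothetical example: five items of increasing difficulty,
# with the three easiest answered correctly
theta, se = rasch_ability([1, 1, 1, 0, 0], [-2.0, -1.0, 0.0, 1.0, 2.0])
```

Note that the maximum-likelihood estimate does not exist for perfect or zero scores; operational programs handle those cases separately.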

Original language: English (US)
Pages (from-to): 315-319
Number of pages: 5
Journal: Journal of Educational Psychology
Volume: 79
Issue number: 3
DOIs
State: Published - Sep 1987
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Education
  • Developmental and Educational Psychology
