Cross evaluation - A pilot application of a new evaluation mechanism

Ying Sun, Paul Kantor, Tomek Strzalkowski, Robert Rittman, Nina Wacholder

Research output: Contribution to journal › Article


Abstract

The work reports some initial success in extending the Rutgers Paradigm of IR evaluation to the realm of concrete measurement, not in information retrieval per se, but in the arguably more complex domain of Question Answering. Crucial to the paradigm are two components: cross evaluation, and an analytical model that controls for the potential problems of cross evaluation. We describe the experimental design and analytical models. In the models, interaction effects are examined and found not to be important. After eliminating the interaction effects, we are able to extract meaningful and useful results from a very small study involving just three analysts, five topics, and two "systems".
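
The abstract does not give the analytical model's exact form. Purely as an illustration, the sketch below shows one common way to check whether interaction effects matter in a small factorial design like the pilot described above (three analysts, five topics, two systems): fit the factors plus an interaction term, inspect the ANOVA table, and refit without the interaction if it is unimportant. The column names, the synthetic scores, and the use of statsmodels are assumptions for illustration only, not the authors' actual model or data.

    # Minimal sketch (hypothetical, not the paper's model): factorial ANOVA
    # on a 3 analysts x 5 topics x 2 systems design with synthetic scores.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    rng = np.random.default_rng(0)
    analysts = ["A1", "A2", "A3"]              # three analysts, as in the pilot
    topics = [f"T{i}" for i in range(1, 6)]    # five topics
    systems = ["S1", "S2"]                     # two "systems"

    # Hypothetical quality judgments, one per analyst/topic/system cell.
    rows = [(a, t, s, rng.normal(3.0, 0.5))
            for a in analysts for t in topics for s in systems]
    df = pd.DataFrame(rows, columns=["analyst", "topic", "system", "score"])

    # Fit main effects plus an analyst:system interaction term.
    full = smf.ols(
        "score ~ C(analyst) + C(topic) + C(system) + C(analyst):C(system)",
        data=df,
    ).fit()
    print(anova_lm(full, typ=2))  # inspect whether the interaction matters

    # If the interaction is unimportant, drop it and read the system effect
    # from the simpler main-effects model.
    reduced = smf.ols("score ~ C(analyst) + C(topic) + C(system)", data=df).fit()
    print(reduced.summary())

In this kind of design, dropping a negligible interaction term frees degrees of freedom for estimating the main effects, which is what makes useful conclusions possible from so few analysts, topics, and systems.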

Original language: English (US)
Pages (from-to): 383-392
Number of pages: 10
Journal: Proceedings of the ASIST Annual Meeting
Volume: 41
DOIs:
State: Published - Nov 1 2004

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Library and Information Sciences
