Signal detection theory (SDT), the standard mathematical framework by which we understand how stimuli are classified into categories such as signal or noise, is an essential part of the modern psychologist’s toolkit. This article introduces some mathematical tools derived from information theory which allow surprisingly simple approximations to key quantities in SDT. The main idea is a lower bound on the probability of correct classification of a stimulus, expressed as a function of information-theoretic properties of the generating distributions. This bound depends on three distinct factors, each of which can be quantified information-theoretically: (a) the prior uncertainty in the choice of generating distribution; (b) the inherent separability of the classes; and (c) the discrepancy between the observer’s model of the class distributions and the “true” model. The bound is only a loose substitute for the conventional method of computing proportion correct (via integration) but generalizes readily to multiple dimensions and larger numbers of stimulus categories, where direct integration is computationally difficult. Moreover, unlike most conventional SDT formulae, this bound does not require Gaussian distributions. Most importantly, the information-theoretic signal detection theory (IT-SDT) framework sheds light on the way classification performance depends on the discrepancy between the observer’s assumptions and those actually governing the environment.
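To make the abstract's reference to the conventional method concrete, the following sketch computes proportion correct for the standard equal-variance Gaussian case, where the integral has the well-known closed form Φ(d′/2) (equal priors, unbiased midpoint criterion), and checks it by Monte Carlo sampling. This is textbook SDT, not the article's information-theoretic bound; all function names here are illustrative.

```python
import numpy as np
from math import erf, sqrt

def phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def proportion_correct_gaussian(d_prime):
    """Closed-form proportion correct for equal-variance Gaussian SDT
    with equal priors and an unbiased (midpoint) criterion: Phi(d'/2)."""
    return phi(d_prime / 2.0)

def proportion_correct_mc(d_prime, n=200_000, seed=0):
    """Monte Carlo check: sample stimuli from the two class distributions
    (noise ~ N(0, 1), signal ~ N(d', 1)) and classify each sample with
    the midpoint criterion at d'/2."""
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, 2, size=n)          # 0 = noise, 1 = signal
    x = rng.normal(loc=labels * d_prime, scale=1.0)
    decisions = (x > d_prime / 2.0).astype(int)  # midpoint criterion
    return float(np.mean(decisions == labels))
```

In higher dimensions or with more than two non-Gaussian categories, no such closed form is available and direct integration becomes expensive, which is the setting where an information-theoretic bound of the kind described above is attractive.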
Keywords
- Bayesian inference
- information theory
- signal detection theory