On Loss Functions and Regret Bounds for Multi-Category Classification

Zhiqiang Tan, Xinwei Zhang

Research output: Contribution to journal › Article › peer-review


Abstract

We develop new approaches in multi-class settings for constructing loss functions and establishing corresponding regret bounds with respect to the zero-one or cost-weighted classification loss. We provide new general representations of losses by deriving inverse mappings from a concave generalized entropy to a loss through a convex dissimilarity function related to the multi-distribution $f$-divergence. This approach is then applied to study both hinge-like losses and proper scoring rules. In the first case, we derive new hinge-like convex losses, which are tighter extensions outside the probability simplex than related hinge-like losses and geometrically simpler with fewer non-differentiable edges. We also establish a classification regret bound in general for all losses with the same generalized entropy as the zero-one loss, thereby substantially extending and improving existing results. In the second case, we identify new sets of multi-class proper scoring rules through different types of dissimilarity functions and reveal interesting relationships between various composite losses currently in use. We also establish new classification regret bounds in general for multi-class proper scoring rules and, as applications, provide simple meaningful regret bounds for two specific sets of proper scoring rules. These results generalize, for the first time, previous two-class regret bounds to multi-class settings.
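As a generic numerical illustration of the proper-scoring-rule and generalized-entropy concepts discussed in the abstract (not the paper's specific hinge-like constructions), the sketch below checks two textbook facts for the multi-class Brier score: reporting the true class distribution minimizes expected loss (properness), and the resulting minimum is the concave generalized entropy of that score, here the Gini index 1 - Σ q_k². All function names are illustrative, not from the paper.

```python
# Hedged sketch: properness of the multi-class Brier score and its
# generalized entropy (Gini index). Standard facts, not the paper's
# novel losses; names below are illustrative.

def brier_loss(p, y):
    """Brier score for predicted distribution p and true class label y."""
    return sum((p[k] - (1.0 if k == y else 0.0)) ** 2 for k in range(len(p)))

def expected_loss(p, q):
    """Expected Brier loss of report p when the true class distribution is q."""
    return sum(q[y] * brier_loss(p, y) for y in range(len(q)))

q = [0.5, 0.3, 0.2]           # true class probabilities (3 classes)
honest = expected_loss(q, q)  # loss of the truthful report p = q

# Properness: no dishonest report does strictly better in expectation.
for p in ([0.6, 0.2, 0.2], [1/3, 1/3, 1/3], [0.4, 0.4, 0.2]):
    assert expected_loss(p, q) >= honest

# Generalized entropy of the Brier score: H(q) = min_p E[loss] = 1 - sum q_k^2.
gini = 1.0 - sum(x * x for x in q)
assert abs(honest - gini) < 1e-12
```

The generalized entropy here plays the role of the concave function from which, per the abstract, losses can be recovered via a convex dissimilarity function; the Brier/Gini pair is simply the most familiar multi-class instance.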

Original language: English (US)
Pages (from-to): 5295-5313
Number of pages: 19
Journal: IEEE Transactions on Information Theory
Volume: 68
Issue number: 8
DOIs
State: Published - Aug 1 2022

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences

Keywords

  • Boosting
  • Bregman divergence
  • composite loss
  • exponential loss
  • f-divergence
  • generalized entropy
  • hinge loss
  • proper scoring rule
  • surrogate regret bounds
