Generalized Twin Gaussian processes using Sharma–Mittal divergence

Mohamed Elhoseiny, Ahmed Elgammal

Research output: Contribution to journal › Article › peer-review

Abstract

There has been growing interest in mutual information measures due to their wide range of applications in machine learning and computer vision. In this paper, we present a generalized structured regression framework based on Sharma–Mittal (SM) divergence, a relative entropy measure, which is introduced to the machine learning community in this work. SM divergence is a generalized mutual information measure that subsumes the widely used Rényi, Tsallis, Bhattacharyya, and Kullback–Leibler (KL) relative entropies. Specifically, we study SM divergence as a cost function in the context of Twin Gaussian processes (TGP) (Bo and Sminchisescu 2010), generalizing the KL-divergence formulation without computational penalty. Through a theoretical analysis, we show interesting properties of Sharma–Mittal TGP (SMTGP) that cover missing insights in the traditional TGP formulation, and we derive this theory for SM-divergence, of which KL-divergence is a special case. Experimentally, we evaluated the proposed SMTGP framework on several datasets. The results show that SMTGP achieves better predictions than KL-based TGP, since it offers a larger class of models through its parameters, which are learned from the data.
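
The "without computational penalty" claim rests on the fact that the SM divergence between two Gaussians has a closed form via the Gaussian alpha-integral, just as the KL divergence does. Below is a minimal NumPy sketch of that closed form; the function name sm_divergence, its signature, and the numerical tolerance are our own illustrative choices, not the authors' released code.

```python
# A minimal sketch (not the paper's code) of the closed-form Sharma-Mittal
# divergence between two multivariate Gaussians N(mu_p, cov_p), N(mu_q, cov_q).
# Special cases: beta -> 1 gives Renyi, beta = alpha gives Tsallis, and
# alpha, beta -> 1 gives the KL divergence used by standard TGP.
import numpy as np

def sm_divergence(mu_p, cov_p, mu_q, cov_q, alpha, beta, eps=1e-10):
    """Sharma-Mittal divergence D_{alpha,beta}(N_p || N_q), alpha != 1.

    Uses the Gaussian alpha-integral
        K = int p^alpha q^(1-alpha) dx
          = sqrt(|cov_p|^(1-alpha) |cov_q|^alpha / |cov_*|)
            * exp(-alpha(1-alpha)/2 * dmu^T cov_*^{-1} dmu),
    with cov_* = alpha*cov_q + (1-alpha)*cov_p, and then
        D = (K^((1-beta)/(1-alpha)) - 1) / (beta - 1).
    """
    dmu = np.asarray(mu_p) - np.asarray(mu_q)
    cov_star = alpha * cov_q + (1.0 - alpha) * cov_p
    # log of the alpha-integral K, via slogdet for numerical stability
    _, ld_p = np.linalg.slogdet(cov_p)
    _, ld_q = np.linalg.slogdet(cov_q)
    _, ld_s = np.linalg.slogdet(cov_star)
    quad = dmu @ np.linalg.solve(cov_star, dmu)
    log_k = 0.5 * ((1 - alpha) * ld_p + alpha * ld_q - ld_s) \
            - 0.5 * alpha * (1 - alpha) * quad
    if abs(beta - 1.0) < eps:          # Renyi limit as beta -> 1
        return log_k / (alpha - 1.0)
    return (np.exp((1 - beta) / (1 - alpha) * log_k) - 1.0) / (beta - 1.0)

# Sanity check: alpha = beta close to 1 should approach the Gaussian KL
# divergence, which is about 0.693 for these two distributions.
mu0, S0 = np.zeros(2), np.eye(2)
mu1, S1 = np.ones(2), 2.0 * np.eye(2)
print(sm_divergence(mu0, S0, mu1, S1, alpha=0.999, beta=0.999))
```

In SMTGP, a divergence of this form (between the Gaussians induced by the input and output kernels) replaces the KL term in the TGP objective, and alpha, beta become extra degrees of freedom fitted to the data.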

Original language: English (US)
Pages (from-to): 399-424
Number of pages: 26
Journal: Machine Learning
Volume: 100
Issue number: 2-3
DOIs
State: Published - Sep 17 2015

All Science Journal Classification (ASJC) codes

  • Software
  • Artificial Intelligence

Keywords

  • Image reconstruction
  • Pose estimation
  • Sharma–Mittal entropy
  • Structured regression
  • Twin Gaussian processes
