Learning ambiguities using bayesian mixture of experts

Atul Kanaujia, Dimitri Metaxas

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

7 Scopus citations

Abstract

Mixture of Experts (ME) is an ensemble of function approximators that fit the clustered data set locally rather than globally. ME provides a useful tool for learning multi-valued mappings (ambiguities) in a data set. Training a Mixture of Experts involves learning a multi-category classifier for the gating distribution and fitting a regressor within each cluster. ME learning is based on divide-and-conquer, which is known to increase the error due to variance. To avoid overfitting, several researchers have proposed using linear experts. However, in the absence of any knowledge of the non-linearities present in the data set, it is not clear how many linear experts are needed to model the data accurately. In this work we propose a Bayesian framework for learning a Mixture of Experts. Bayesian learning intrinsically embodies regularization and performs model selection using Occam's razor. Bayesian learning methods have previously been applied to classification and regression to avoid scale sensitivity and the orthodox model-selection procedure of cross-validation. Although exact Bayesian learning is computationally intractable, its approximations do yield sparser and more compact models.
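The training loop the abstract describes — softly assigning points to clusters and fitting a regressor within each — can be sketched with a small EM procedure. This is a simplified, illustrative version (a mixture of linear regressions with input-independent mixing weights rather than a fully gated ME, and not the paper's Bayesian method); the data set, variable names, and hyperparameters are assumptions for the demo. It recovers the two branches of the multi-valued inverse of y = x², the kind of ambiguity the paper targets:

```python
import numpy as np

rng = np.random.default_rng(0)

# Multi-valued data set: learn x from y, where x -> y = x**2 folds two
# branches (x = +sqrt(y) and x = -sqrt(y)) onto the same input.
x = rng.uniform(-1.0, 1.0, 400)
y = x**2 + rng.normal(0.0, 0.02, 400)

K = 2                       # number of linear experts
w = rng.normal(size=K)      # expert k predicts x_hat = w[k] * y + b[k]
b = rng.normal(size=K)
pi = np.full(K, 1.0 / K)    # mixing weights (input-independent here)
sigma2 = 0.1                # shared noise variance of the experts

for _ in range(50):
    # E-step: responsibility of each expert for each data point,
    # proportional to mixing weight times Gaussian likelihood.
    pred = np.outer(y, w) + b                          # shape (N, K)
    log_lik = -0.5 * (x[:, None] - pred) ** 2 / sigma2
    r = pi * np.exp(log_lik - log_lik.max(axis=1, keepdims=True))
    r /= r.sum(axis=1, keepdims=True)

    # M-step: weighted least squares for each expert (the per-cluster
    # regressor), then re-estimate mixing weights and noise variance.
    for k in range(K):
        rk = r[:, k]
        A = np.array([[np.sum(rk * y * y), np.sum(rk * y)],
                      [np.sum(rk * y),     np.sum(rk)]])
        c = np.array([np.sum(rk * y * x), np.sum(rk * x)])
        w[k], b[k] = np.linalg.solve(A, c)
    pi = r.mean(axis=0)
    pred = np.outer(y, w) + b
    sigma2 = np.sum(r * (x[:, None] - pred) ** 2) / len(x)

# After convergence the two experts give linear approximations of the
# two branches x ≈ +sqrt(y) and x ≈ -sqrt(y): a single global regressor
# would average the branches and predict x ≈ 0 everywhere.
```

Replacing the fixed mixing weights `pi` with an input-conditioned multi-class classifier turns this into the full gated ME; the paper's contribution is then to place a Bayesian prior over the expert and gate parameters so that regularization and the number of effective experts are handled by Occam's razor rather than by cross-validation.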

Original language: English (US)
Title of host publication: Proceedings - 18th IEEE International Conference on Tools with Artificial Intelligence, ICTAI 2006
Pages: 436-440
Number of pages: 5
DOIs: https://doi.org/10.1109/ICTAI.2006.73
State: Published - Dec 1 2006
Event: 18th IEEE International Conference on Tools with Artificial Intelligence, ICTAI 2006 - Arlington, VA, United States
Duration: Oct 13 2006 - Oct 15 2006

Publication series

Name: Proceedings - International Conference on Tools with Artificial Intelligence, ICTAI
ISSN (Print): 1082-3409

Other

Other: 18th IEEE International Conference on Tools with Artificial Intelligence, ICTAI 2006
Country: United States
City: Arlington, VA
Period: 10/13/06 - 10/15/06

All Science Journal Classification (ASJC) codes

  • Engineering (all)


  • Cite this

    Kanaujia, A., & Metaxas, D. (2006). Learning ambiguities using bayesian mixture of experts. In Proceedings - 18th IEEE International Conference on Tools with Artificial Intelligence, ICTAI 2006 (pp. 436-440). [4031928] (Proceedings - International Conference on Tools with Artificial Intelligence, ICTAI). https://doi.org/10.1109/ICTAI.2006.73