Large scale learning of active shape models

Atul Kanaujia, Dimitris N. Metaxas

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

10 Scopus citations

Abstract

We propose a framework for learning statistical shape models of faces as piecewise linear models. Specifically, our methodology builds on primitive active shape models (ASM) to handle large-scale variation in the shapes and appearances of faces. Non-linearities in the shape manifold arising from large head rotations cannot be accurately modeled by ASM. Moreover, an overly general image descriptor causes the cost function to have multiple local minima, which in turn degrades the quality of shape registration. We propose using multiple overlapping subspaces with more discriminative local image descriptors to capture the larger variance occurring in the data set. We also apply distance metric learning to enhance the similarity of descriptors belonging to the same shape subspace class. Our generic algorithm can be applied to large-scale shape analysis and registration.
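The piecewise linear modeling described in the abstract can be sketched as clustering aligned shape vectors and fitting a separate linear (PCA) subspace to each cluster. The sketch below is an illustrative assumption, not the authors' exact algorithm: the function name, the use of k-means for partitioning, and the variance-retention threshold are all choices made here for demonstration, and the paper's discriminative descriptors and metric learning are omitted.

```python
import numpy as np

def fit_piecewise_linear_shape_model(shapes, n_clusters=3, var_keep=0.95, seed=0):
    """Illustrative piecewise linear shape model: partition aligned shape
    vectors with a simple k-means, then fit a PCA subspace per cluster.
    (A sketch of the general idea only, not the paper's method.)"""
    shapes = np.asarray(shapes, dtype=float)
    rng = np.random.default_rng(seed)
    # Initialize cluster centers from random training shapes.
    centers = shapes[rng.choice(len(shapes), n_clusters, replace=False)].copy()
    labels = np.zeros(len(shapes), dtype=int)
    for _ in range(20):  # fixed number of k-means iterations
        dists = np.linalg.norm(shapes[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for k in range(n_clusters):
            pts = shapes[labels == k]
            if len(pts):
                centers[k] = pts.mean(axis=0)
    models = []
    for k in range(n_clusters):
        pts = shapes[labels == k]
        if len(pts) == 0:  # skip empty clusters (possible with bad init)
            continue
        mean = pts.mean(axis=0)
        # PCA via SVD of the centered cluster; keep enough components
        # to explain var_keep of the cluster's variance.
        _, s, vt = np.linalg.svd(pts - mean, full_matrices=False)
        var = s ** 2
        m = int(np.searchsorted(np.cumsum(var) / var.sum(), var_keep)) + 1
        models.append((mean, vt[:m]))  # (cluster mean, subspace basis)
    return labels, models
```

Each stored `(mean, basis)` pair defines one local linear subspace; a query shape would be assigned to the nearest cluster and projected onto that cluster's basis, which is how overlapping local subspaces can cover a non-linear shape manifold.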

Original language: English (US)
Title of host publication: 2007 IEEE International Conference on Image Processing, ICIP 2007 Proceedings
Publisher: IEEE Computer Society
Pages: 265-268
Number of pages: 4
ISBN (Print): 1424414377, 9781424414376
DOIs
State: Published - 2007
Event: 14th IEEE International Conference on Image Processing, ICIP 2007 - San Antonio, TX, United States
Duration: Sep 16 2007 - Sep 19 2007

Publication series

Name: Proceedings - International Conference on Image Processing, ICIP
Volume: 1
ISSN (Print): 1522-4880

Other

Other: 14th IEEE International Conference on Image Processing, ICIP 2007
Country/Territory: United States
City: San Antonio, TX
Period: 9/16/07 - 9/19/07

All Science Journal Classification (ASJC) codes

  • Engineering(all)

Keywords

  • Active shape models
  • Anderson-Darling statistics
  • Relevance component analysis
  • SIFT

