A hybrid 3D segmentation framework

Dimitris Metaxas, Ting Chen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Scopus citation

Abstract

Classical segmentation methods, such as region-based and boundary-based methods, cannot make full use of the information provided in an image. In this paper we present a hybrid framework that allows us to solve a large variety of 3D medical image segmentation problems. The framework integrates Gibbs Prior models, marching cubes, and deformable models. First, Gibbs Prior models are applied to each slice of a medical volume, and the per-slice segmentation results are combined into a 3D binary mask of the object. Next, we create a deformable mesh from this binary mask using marching cubes. The deformable model then converges to edge features in the volume under image-derived external forces. Finally, we update the parameters of the Gibbs Prior models using the deformable-model segmentation result. These modules run recursively until a segmentation solution is reached. The hybrid segmentation framework is implemented using ITK and has been used to segment various clinical objects. We demonstrate high-quality 3D segmentation results.
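The recursive pipeline described in the abstract can be sketched as a simple loop. The sketch below is a minimal illustration, not the authors' ITK implementation: the Gibbs Prior stage is stood in for by per-voxel thresholding, the marching-cubes stage by collecting foreground voxel coordinates, and the deformable-model stage is omitted; the parameter update re-estimates the threshold from the current inside/outside intensity means. All function and parameter names are hypothetical.

```python
import numpy as np

def hybrid_segment(volume, n_iters=3, init_threshold=0.5):
    """Sketch of the recursive hybrid loop from the abstract.

    Each stage is a simplified stand-in for the corresponding module
    (Gibbs Prior model, marching cubes, deformable model)."""
    params = {"threshold": init_threshold}
    mask = np.zeros(volume.shape, dtype=np.uint8)
    for _ in range(n_iters):
        # Stage 1: Gibbs-Prior-style slice labeling (stand-in: thresholding
        # the whole volume, which treats every slice identically).
        mask = (volume > params["threshold"]).astype(np.uint8)
        # Stage 2: marching cubes would extract a triangle mesh from the
        # binary mask; here we only collect foreground voxel coordinates.
        mesh = np.argwhere(mask > 0)
        # Stage 3: a deformable model would refine `mesh` toward edge
        # features under image-derived external forces (omitted here).
        # Stage 4: re-estimate the prior's parameters from the current
        # result (stand-in: midpoint of inside/outside mean intensities).
        inside, outside = volume[mask == 1], volume[mask == 0]
        if inside.size and outside.size:
            params["threshold"] = 0.5 * (inside.mean() + outside.mean())
    return mask, params

# Toy usage: a bright cube inside a dark volume is recovered as the mask.
vol = np.zeros((8, 8, 8))
vol[2:6, 2:6, 2:6] = 1.0
mask, params = hybrid_segment(vol)
```

The point of the loop structure is the feedback emphasized in the paper: the region-based stage seeds the boundary-based stage, and the refined boundary in turn re-parameterizes the region model on the next pass.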

Original language: English (US)
Title of host publication: 2004 2nd IEEE International Symposium on Biomedical Imaging
Subtitle of host publication: Macro to Nano
Pages: 13-16
Number of pages: 4
State: Published - 2004
Event: 2004 2nd IEEE International Symposium on Biomedical Imaging: Macro to Nano - Arlington, VA, United States
Duration: Apr 15 2004 - Apr 18 2004

Publication series

Name: 2004 2nd IEEE International Symposium on Biomedical Imaging: Macro to Nano
Volume: 1

Other

Other: 2004 2nd IEEE International Symposium on Biomedical Imaging: Macro to Nano
Country/Territory: United States
City: Arlington, VA
Period: 4/15/04 - 4/18/04

All Science Journal Classification (ASJC) codes

  • Engineering (all)
