A human vision based computational model for chromatic texture segregation

Thomas V. Papathomas, Ramanujan S. Kashi, Andrei Gorea

Research output: Contribution to journal › Article › peer-review


Abstract

We have developed a computational model for texture perception that has physiological relevance and correlates well with human performance. The model simulates characteristics of human visual processing by incorporating mechanisms tuned to luminance polarity, orientation, spatial frequency, and color, which are characteristic features of any textural image. We obtained a very good correlation between the model's simulation results and data from psychophysical experiments with a systematically selected set of visual stimuli whose texture patterns are defined by spatial variations in color, luminance, and orientation. In addition, the model correctly predicts texture segregation performance on key benchmark and natural textures. This represents a first effort to incorporate chromatic signals into texture segregation models of psychophysical relevance, most of which have so far treated only grey-level images. Another novel feature of the model is the extension of the concept of spatial double opponency to domains beyond color, such as orientation and spatial frequency. The model has potential applications in image processing, machine vision and pattern recognition, and scientific visualization.
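The abstract describes a filter-bank architecture with mechanisms tuned to luminance polarity, orientation, spatial frequency, and color, combined through a spatial double-opponent (center-minus-surround) stage. The sketch below is not the authors' implementation; it is a minimal illustration of that general style of model, assuming a float RGB image in [0, 1], simple luminance and color-opponent channels, an odd-symmetric Gabor bank, and Gaussian center/surround pooling. All function names and parameter values here (opponent_channels, gabor_kernel, segregation_map, the chosen frequencies and sigmas) are hypothetical choices for illustration.

    # Minimal sketch of a double-opponent, filter-bank texture-gradient model.
    # Not the paper's model; an illustrative approximation only.
    import numpy as np
    from scipy.ndimage import convolve, gaussian_filter

    def opponent_channels(rgb):
        """Split an RGB image into luminance, red-green, and blue-yellow channels."""
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        lum = (r + g + b) / 3.0          # achromatic (luminance) channel
        rg = r - g                       # red-green opponent channel
        by = b - (r + g) / 2.0           # blue-yellow opponent channel
        return [lum, rg, by]

    def gabor_kernel(freq, theta, sigma=4.0, size=21):
        """Odd-symmetric Gabor tuned to spatial frequency `freq` and orientation `theta`."""
        half = size // 2
        y, x = np.mgrid[-half:half + 1, -half:half + 1]
        xr = x * np.cos(theta) + y * np.sin(theta)
        envelope = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
        return envelope * np.sin(2.0 * np.pi * freq * xr)

    def segregation_map(rgb, freqs=(0.1, 0.2),
                        thetas=(0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4),
                        sigma_center=2.0, sigma_surround=8.0):
        """Crude texture-gradient map: larger values suggest texture boundaries."""
        gradient = np.zeros(rgb.shape[:2])
        for chan in opponent_channels(rgb):
            for f in freqs:
                for th in thetas:
                    # Rectified filter energy in one (color, frequency, orientation) channel.
                    resp = np.abs(convolve(chan, gabor_kernel(f, th)))
                    # Spatial opponency: compare local (center) energy with its surround.
                    center = gaussian_filter(resp, sigma_center)
                    surround = gaussian_filter(resp, sigma_surround)
                    gradient += np.abs(center - surround)
        return gradient

With these assumed settings, the map takes larger values wherever rectified filter energy changes across space, so regions that differ in color, orientation, or spatial-frequency content should be separated by elevated responses along their shared border, which is the general behavior the abstract attributes to double-opponent pooling.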

Original language: English (US)
Pages (from-to): 428-440
Number of pages: 13
Journal: IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Volume: 27
Issue number: 3
DOIs
State: Published - 1997

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
  • Software
  • Information Systems
  • Human-Computer Interaction
  • Computer Science Applications
  • Electrical and Electronic Engineering
