Computational model for interactions between auditory and visual motion mechanisms

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Jain et al. [1] showed that a strong motion signal in one modality (visual/auditory) influences the perception of a simultaneously presented weak motion signal in the other modality (auditory/visual) for motion along the three cardinal axes. They also observed auditory crossmodal motion aftereffects (MAE) for all directions of motion, and visual crossmodal MAE for the vertical direction only. We developed a neurophysiologically plausible computational model to explain the observed interactions. The model was based on the hypothesis that these crossmodal interactions are mediated by higher integrative multimodal areas, such as the superior colliculus (SC), through feedback connections to the lower unimodal areas of the human brain.
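The feedback hypothesis above can be illustrated with a minimal sketch (a hypothetical toy implementation, not the authors' actual model): two unimodal motion signals are combined at an SC-like multimodal stage, and the integrated estimate is fed back to bias each unimodal percept. The weights and gain below are arbitrary assumptions for illustration.

```python
def integrate(visual, auditory, w_v=0.6, w_a=0.4):
    """SC-like multimodal stage: weighted sum of unimodal motion signals.
    Signed values encode direction (+/- along one axis); the weights
    w_v and w_a are illustrative assumptions, not fitted parameters."""
    return w_v * visual + w_a * auditory

def perceived(visual, auditory, gain=0.5):
    """Unimodal percepts after feedback from the multimodal stage:
    each unimodal signal is pulled toward the integrated estimate
    by a hypothetical feedback gain."""
    m = integrate(visual, auditory)
    v_out = visual + gain * (m - visual)
    a_out = auditory + gain * (m - auditory)
    return v_out, a_out

# A strong visual motion signal biases a weak auditory one toward
# its own direction, as in the crossmodal interactions described above.
v_percept, a_percept = perceived(visual=1.0, auditory=0.1)
```

In this toy version, the auditory percept (`a_percept`) is shifted toward the direction of the strong visual signal, qualitatively matching the crossmodal bias the abstract describes; it does not attempt to capture the aftereffect dynamics.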

Original language: English (US)
Title of host publication: NEBEC 2009 - Proceedings of the IEEE 35th Annual Northeast Bioengineering Conference
DOIs
State: Published - 2009
Event: IEEE 35th Annual Northeast Bioengineering Conference, NEBEC 2009 - Boston, MA, United States
Duration: Apr 3 2009 – Apr 5 2009

Publication series

Name: Bioengineering, Proceedings of the Northeast Conference
ISSN (Print): 1071-121X

Other

Other: IEEE 35th Annual Northeast Bioengineering Conference, NEBEC 2009
Country/Territory: United States
City: Boston, MA
Period: 4/3/09 – 4/5/09

All Science Journal Classification (ASJC) codes

  • Bioengineering

