Analog neural nets with Gaussian or other common noise distributions cannot recognize arbitrary regular languages

Wolfgang Maass, Eduardo D. Sontag

Research output: Contribution to journal › Article › peer-review

41 Scopus citations

Abstract

We consider recurrent analog neural nets in which the output of each gate is subject to Gaussian noise, or to any other common noise distribution that is nonzero on a sufficiently large part of the state space. We show that many regular languages cannot be recognized by networks of this type, and we give a precise characterization of the languages that can be recognized. This result implies severe constraints on the possibilities for constructing recurrent analog neural nets that are robust against realistic types of analog noise. On the other hand, we present a method for constructing feedforward analog neural nets that are robust with regard to analog noise of this type.
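To make the noise model in the abstract concrete, the following is a minimal sketch (not taken from the paper) of a sigmoidal recurrent analog net in which every gate output is perturbed by additive Gaussian noise. The function name `noisy_recurrent_step`, the weight matrices, the clipping to the unit cube, and the noise level `sigma_noise` are illustrative assumptions, not the authors' construction.

```python
import numpy as np

def noisy_recurrent_step(x, u, W, V, b, sigma_noise, rng):
    """One update of a recurrent analog net whose gate outputs
    are perturbed by additive Gaussian noise (illustrative only)."""
    pre = W @ x + V @ u + b                 # pre-activations from state x and input u
    clean = 1.0 / (1.0 + np.exp(-pre))      # sigmoidal gate outputs in (0, 1)
    noise = rng.normal(0.0, sigma_noise, size=clean.shape)
    return np.clip(clean + noise, 0.0, 1.0) # keep the noisy state inside the unit cube

# Hypothetical run: a 3-unit net processing a short binary input word.
rng = np.random.default_rng(0)
W = rng.normal(0.0, 1.0, (3, 3))   # recurrent weights (arbitrary)
V = rng.normal(0.0, 1.0, (3, 1))   # input weights (arbitrary)
b = np.zeros(3)
x = np.full(3, 0.5)                # initial state
for symbol in [1, 0, 1, 1]:        # input word over {0, 1}
    x = noisy_recurrent_step(x, np.array([float(symbol)]), W, V, b,
                             sigma_noise=0.05, rng=rng)
print(x)  # final noisy state; acceptance would be read off this state
```

Because the noise has full support on a large region of the state space, the reachable state distribution smears out over time, which is the informal reason such a net cannot reliably track the exact amount of long-range counting that some regular languages require.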

Original language: English (US)
Pages (from-to): 771-782
Number of pages: 12
Journal: Neural Computation
Volume: 11
Issue number: 3
DOIs
State: Published - Apr 1 1999

All Science Journal Classification (ASJC) codes

  • Arts and Humanities (miscellaneous)
  • Cognitive Neuroscience

