### Abstract

The network topology of neurons in the brain exhibits an abundance of feedback connections, but the computational function of these feedback connections is largely unknown. We present a computational theory that characterizes the gain in computational power achieved through feedback in dynamical systems with fading memory. It implies that many such systems acquire through feedback universal computational capabilities for analog computing with a non-fading memory. In particular, we show that feedback enables such systems to process time-varying input streams in diverse ways according to rules that are implemented through internal states of the dynamical system. In contrast to previous attractor-based computational models for neural networks, these flexible internal states are high-dimensional attractors of the circuit dynamics that still allow the circuit state to absorb new information from online input streams. In this way one arrives at novel models for working memory, integration of evidence, and reward expectation in cortical circuits. We show that they are applicable to circuits of conductance-based Hodgkin-Huxley (HH) neurons with high levels of noise that reflect experimental data on in vivo conditions.
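The core idea of the abstract can be illustrated with a toy example (a minimal sketch, not taken from the paper): a single leaky unit with self-feedback gain `g`. With weak feedback (`g < 1`) the system has fading memory, so a brief input pulse decays away; with strong feedback (`g > 1`) the same unit becomes bistable, and a pulse switches it into a persistent attractor state, a one-dimensional caricature of the non-fading memory that feedback provides. All variable names and parameter values here are illustrative choices.

```python
import numpy as np

def step(h, u, g):
    """One discrete-time update of a saturating unit with
    self-feedback gain g and external input u."""
    return np.tanh(g * h + u)

def run(g, n_steps=50, pulse=2.0):
    """Drive the unit with a single input pulse at t=0,
    then let it evolve autonomously; return the final state."""
    h = 0.0
    for t in range(n_steps):
        u = pulse if t == 0 else 0.0
        h = step(h, u, g)
    return h

fading = run(g=0.5)      # weak feedback: pulse is forgotten, h -> 0
persistent = run(g=2.0)  # strong feedback: bistable, h stays near +0.96
```

With `g = 0.5` the final state is essentially zero, while with `g = 2.0` it settles near the positive fixed point of `h = tanh(2h)`: the feedback loop has turned a fading-memory system into a switch that remembers the pulse indefinitely.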

| Original language | English (US) |
| --- | --- |
| Title of host publication | Advances in Neural Information Processing Systems 18 - Proceedings of the 2005 Conference |
| Pages | 835-842 |
| Number of pages | 8 |
| State | Published - Dec 1 2005 |
| Event | 2005 Annual Conference on Neural Information Processing Systems, NIPS 2005 - Vancouver, BC, Canada. Duration: Dec 5 2005 → Dec 8 2005 |

### Publication series

| Name | Advances in Neural Information Processing Systems |
| --- | --- |
| ISSN (Print) | 1049-5258 |

### Other

| Other | 2005 Annual Conference on Neural Information Processing Systems, NIPS 2005 |
| --- | --- |
| Country | Canada |
| City | Vancouver, BC |
| Period | 12/5/05 → 12/8/05 |

### All Science Journal Classification (ASJC) codes

- Computer Networks and Communications
- Information Systems
- Signal Processing

### Cite this

Maass, Wolfgang; Joshi, Prashant; Sontag, Eduardo. **Principles of real-time computing with feedback applied to cortical microcircuit models.** In *Advances in Neural Information Processing Systems 18 - Proceedings of the 2005 Conference* (Advances in Neural Information Processing Systems), pp. 835-842, 2005. ISBN 9780262232531.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Scopus record: http://www.scopus.com/inward/record.url?scp=84864035825&partnerID=8YFLogxK (AN: SCOPUS:84864035825)