Increasing the efficiency of a neural network through unlearning

J. L. Van Hemmen, L. B. Ioffe, R. Kühn, M. Vaas

Research output: Contribution to journal › Article › peer-review

31 Scopus citations

Abstract

It has been suggested that dream (REM) sleep leads to unlearning of parasitic or spurious states. Here we present the results of an extensive numerical study of unlearning in a network of formal neurons (Ising spins) whose activity may vary. Our results are threefold. First, unlearning greatly improves the performance of the network; e.g., the storage capacity may be more than quadrupled. Second, the optimal number of unlearning steps ("dreams") does not depend on the activity. Third, using the simplest form of Hebbian learning, the network can store and retrieve patterns whose activity differs. A microscopic picture of the underlying processes is presented.
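The procedure the abstract describes can be illustrated with a minimal sketch in the standard Hopfield setting: store random ±1 patterns with the simplest Hebbian rule, then perform unlearning steps ("dreams") by relaxing the network from a random initial state and slightly weakening whatever attractor it falls into. All parameter values below (network size, pattern count, unlearning strength, number of dreams) are illustrative assumptions, not the values used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 64           # number of formal neurons (Ising spins, values +/-1)
P = 8            # number of stored patterns
EPSILON = 0.01   # unlearning strength per "dream" (illustrative)
DREAMS = 100     # number of unlearning steps (illustrative)

# Random +/-1 patterns to store.
patterns = rng.choice([-1, 1], size=(P, N))

# Simplest Hebbian learning: J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu,
# with no self-coupling.
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0)


def relax(J, s, max_sweeps=50):
    """Zero-temperature asynchronous dynamics until a fixed point."""
    s = s.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(s)):
            s_new = 1 if J[i] @ s >= 0 else -1
            if s_new != s[i]:
                s[i] = s_new
                changed = True
        if not changed:
            break
    return s


def dream(J, epsilon):
    """One unlearning step: find an attractor from a random start, weaken it."""
    s = relax(J, rng.choice([-1, 1], size=J.shape[0]))
    J = J - (epsilon / len(s)) * np.outer(s, s)
    np.fill_diagonal(J, 0)
    return J


for _ in range(DREAMS):
    J = dream(J, EPSILON)

# Retrieval check: a noisy copy of a stored pattern should relax back to it.
probe = patterns[0].copy()
probe[rng.choice(N, size=N // 10, replace=False)] *= -1
recalled = relax(J, probe)
overlap = abs(recalled @ patterns[0]) / N
```

The sketch weakens every attractor it happens to find, stored or spurious; the effect studied in the paper is that spurious states are suppressed much more strongly than the stored patterns, which raises the usable storage capacity.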

Original language: English (US)
Pages (from-to): 386-392
Number of pages: 7
Journal: Physica A: Statistical Mechanics and its Applications
Volume: 163
Issue number: 1
DOIs
State: Published - Feb 1 1990

All Science Journal Classification (ASJC) codes

  • Statistics and Probability
  • Condensed Matter Physics
