On the Number of Memories that can be Perfectly Stored in a Neural Net with Hebb Weights

  • H. J. Sussmann

Research output: Contribution to journal › Letter › peer-review

Abstract

Let {wij} be the weights of the connections of a neural network with n nodes, calculated from m data vectors v1,…, vm in {1, -1}^n according to the Hebb rule. We prove that, if m is not too large relative to n and the vk are random, then with high probability the wij constitute a perfect representation of the vk, in the sense that the vk are completely determined by the wij up to their sign. The conditions under which this is established turn out to be less restrictive than those under which it has been shown that the vk can actually be recovered by letting the network evolve until equilibrium is attained. In the specific case where the entries of the vk are independent and equal to 1 or -1 with probability 1/2, the condition on m is that m should not exceed n/(0.7 log n).
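The abstract's setup can be sketched numerically. Below is a minimal illustration (not the paper's proof technique) of the Hebb rule on random ±1 patterns, together with a check of the weaker, dynamics-based property the abstract contrasts with: that each stored pattern is a fixed point of the sign update rule. The choices n = 500 and m = 15 are illustrative; m is well below the bound n/(0.7 log n) ≈ 115 for this n.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 500, 15  # illustrative sizes; m is far below n / (0.7 * log n)

# m random patterns with i.i.d. entries equal to 1 or -1, each with probability 1/2
V = rng.choice([-1, 1], size=(m, n))

# Hebb rule: w_ij = sum over k of v_k[i] * v_k[j]; self-connections zeroed
W = V.T @ V
np.fill_diagonal(W, 0)

def is_fixed_point(W, v):
    # v is an equilibrium of the network if the sign update leaves it unchanged
    return np.array_equal(np.sign(W @ v), v)

# With m this small relative to n, every stored pattern should be an equilibrium
print(all(is_fixed_point(W, v) for v in V))
```

Note that the paper's result is stronger than this fixed-point check: it concerns recovering the vk (up to sign) directly from the weights, regardless of whether network dynamics can find them.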

Original language: English (US)
Pages (from-to): 174-178
Number of pages: 5
Journal: IEEE Transactions on Information Theory
Volume: 35
Issue number: 1
DOIs
State: Published - Jan 1989

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences
