Enhancement of Hopfield neural networks using stochastic noise processes

Vladimir Pavlovic, Dan Schonfeld, Gary Friedman

Research output: Contribution to conference › Paper

1 Citation (Scopus)

Abstract

Hopfield neural networks (HNN) are a class of densely connected single-layer nonlinear networks of perceptrons. The network's energy function is defined through a learning procedure so that its minima coincide with states from a predefined set. However, because of the network's nonlinearity, a number of undesirable local energy minima emerge from the learning procedure. This has been shown to significantly affect the network's performance. In this work we present a stochastic-process-enhanced bipolar HNN. The presence of the stochastic process in the network enables us to describe its evolution using Markov chain theory. For a fixed network topology, the desired final distribution of states can be reached by modulating the network's stochastic process. Guided by the desired final distribution, we propose a general L2-norm error density function optimization criterion for enhancing the performance of the Hopfield neural network. This criterion can also be viewed in terms of stability intervals associated with the desired and non-desired stable states of the network. Because of the complexity of the general criterion, we relax the optimization to the set of non-desired states. We further formulate a stochastic process design based on the stability intervals which satisfies the optimization criterion and results in enhanced performance of the network. Our experimental simulations confirm the predicted improvement in performance.
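To make the setting concrete, the following is a minimal sketch of a bipolar Hopfield network with Hebbian learning and asynchronous sign updates, where additive Gaussian noise on the local field stands in for a generic stochastic process. This is an illustration of the baseline model only, not the specific stability-interval noise design proposed in the paper; the `noise_std` parameter and function names are assumptions for the example.

```python
import numpy as np

def train_hebbian(patterns):
    """Hebbian learning: W is the sum of outer products of the
    stored bipolar patterns, with the diagonal zeroed out."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0.0)
    return W / n

def recall(W, state, noise_std=0.0, steps=200, rng=None):
    """Asynchronous bipolar updates. Gaussian noise added to the
    local field is a placeholder stochastic process; with
    noise_std=0 this reduces to the deterministic HNN."""
    rng = np.random.default_rng(0) if rng is None else rng
    s = state.astype(int).copy()
    n = len(s)
    for _ in range(steps):
        i = rng.integers(n)                       # pick a random unit
        field = W[i] @ s + noise_std * rng.standard_normal()
        s[i] = 1 if field >= 0 else -1            # bipolar threshold
    return s

# Store two orthogonal patterns, corrupt one bit, and recover it.
patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, -1, -1, 1, 1, -1, -1]])
W = train_hebbian(patterns)
probe = patterns[0].copy()
probe[2] = -probe[2]                              # flip one bit
restored = recall(W, probe, noise_std=0.0)
```

The undesirable local minima discussed in the abstract are spurious fixed points of exactly this update rule; the paper's contribution is choosing the noise process so that the network escapes those states while the desired memories remain stable.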

Original language: English (US)
Pages: 173-182
Number of pages: 10
State: Published - Dec 1 2001
Externally published: Yes

Fingerprint

Hopfield neural networks
Random processes
Network performance
Nonlinear networks
Markov processes
Probability density function
Process design
Topology
Neural networks

All Science Journal Classification (ASJC) codes

  • Signal Processing
  • Software
  • Electrical and Electronic Engineering

Cite this

@conference{f9f2044448004b9f87aef7e0498bebd0,
title = "Enhancement of Hopfield neural networks using stochastic noise processes",
abstract = "Hopfield neural networks (HNN) are a class of densely connected single-layer nonlinear networks of perceptrons. The network's energy function is defined through a learning procedure so that its minima coincide with states from a predefined set. However, because of the network's nonlinearity, a number of undesirable local energy minima emerge from the learning procedure. This has been shown to significantly affect the network's performance. In this work we present a stochastic-process-enhanced bipolar HNN. The presence of the stochastic process in the network enables us to describe its evolution using Markov chain theory. For a fixed network topology, the desired final distribution of states can be reached by modulating the network's stochastic process. Guided by the desired final distribution, we propose a general L2-norm error density function optimization criterion for enhancing the performance of the Hopfield neural network. This criterion can also be viewed in terms of stability intervals associated with the desired and non-desired stable states of the network. Because of the complexity of the general criterion, we relax the optimization to the set of non-desired states. We further formulate a stochastic process design based on the stability intervals which satisfies the optimization criterion and results in enhanced performance of the network. Our experimental simulations confirm the predicted improvement in performance.",
author = "Vladimir Pavlovic and Dan Schonfeld and Gary Friedman",
year = "2001",
month = "12",
day = "1",
language = "English (US)",
pages = "173--182",

}

Enhancement of Hopfield neural networks using stochastic noise processes. / Pavlovic, Vladimir; Schonfeld, Dan; Friedman, Gary.

2001. 173-182.

Research output: Contribution to conference › Paper

TY - CONF

T1 - Enhancement of Hopfield neural networks using stochastic noise processes

AU - Pavlovic, Vladimir

AU - Schonfeld, Dan

AU - Friedman, Gary

PY - 2001/12/1

Y1 - 2001/12/1

N2 - Hopfield neural networks (HNN) are a class of densely connected single-layer nonlinear networks of perceptrons. The network's energy function is defined through a learning procedure so that its minima coincide with states from a predefined set. However, because of the network's nonlinearity, a number of undesirable local energy minima emerge from the learning procedure. This has been shown to significantly affect the network's performance. In this work we present a stochastic-process-enhanced bipolar HNN. The presence of the stochastic process in the network enables us to describe its evolution using Markov chain theory. For a fixed network topology, the desired final distribution of states can be reached by modulating the network's stochastic process. Guided by the desired final distribution, we propose a general L2-norm error density function optimization criterion for enhancing the performance of the Hopfield neural network. This criterion can also be viewed in terms of stability intervals associated with the desired and non-desired stable states of the network. Because of the complexity of the general criterion, we relax the optimization to the set of non-desired states. We further formulate a stochastic process design based on the stability intervals which satisfies the optimization criterion and results in enhanced performance of the network. Our experimental simulations confirm the predicted improvement in performance.

AB - Hopfield neural networks (HNN) are a class of densely connected single-layer nonlinear networks of perceptrons. The network's energy function is defined through a learning procedure so that its minima coincide with states from a predefined set. However, because of the network's nonlinearity, a number of undesirable local energy minima emerge from the learning procedure. This has been shown to significantly affect the network's performance. In this work we present a stochastic-process-enhanced bipolar HNN. The presence of the stochastic process in the network enables us to describe its evolution using Markov chain theory. For a fixed network topology, the desired final distribution of states can be reached by modulating the network's stochastic process. Guided by the desired final distribution, we propose a general L2-norm error density function optimization criterion for enhancing the performance of the Hopfield neural network. This criterion can also be viewed in terms of stability intervals associated with the desired and non-desired stable states of the network. Because of the complexity of the general criterion, we relax the optimization to the set of non-desired states. We further formulate a stochastic process design based on the stability intervals which satisfies the optimization criterion and results in enhanced performance of the network. Our experimental simulations confirm the predicted improvement in performance.

UR - http://www.scopus.com/inward/record.url?scp=0035784049&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0035784049&partnerID=8YFLogxK

M3 - Paper

SP - 173

EP - 182

ER -