Backpropagation separates when perceptrons do

Research output: Contribution to conference › Paper › peer-review

21 Scopus citations

Abstract

Consideration is given to the behavior of the least-squares problem that arises when one attempts to train a feedforward net with no hidden neurons. It is assumed that the net has monotonic nonlinear output units. Under the assumption that a training set is separable, that is, that there is a set of achievable outputs for which the error is zero, the authors show that there are no nonglobal minima. More precisely, they assume that the error is of a threshold least-mean square (LMS) type, in that the error function is zero for values beyond the target value. The authors' proof gives, in addition, the following stronger result: the continuous gradient adjustment procedure is such that from any initial weight configuration a separating set of weights is obtained in finite time. Thus they have a precise analog of the perceptron learning theorem. The authors contrast their results with the more classical pattern recognition problem of threshold LMS with linear output units.
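For intuition only, one plausible formalization of the threshold LMS error described above (the notation here is assumed, not taken from the paper): with inputs x_i, binary targets t_i ∈ {−1, +1}, weights w, a monotonic output nonlinearity σ, and a target level τ > 0, set

\[
E(w) \;=\; \tfrac{1}{2} \sum_{i=1}^{m} \Bigl[ \max\bigl(0,\; \tau - t_i\,\sigma(w^{\top} x_i)\bigr) \Bigr]^{2},
\]

so each term vanishes once the corresponding output lies beyond its target value on the correct side. In these terms, the stated result says that if some weight vector achieves E(w) = 0 (the separable case), then the continuous gradient flow \(\dot{w} = -\nabla E(w)\) reaches such a separating weight vector in finite time from any initial condition.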

Original language: English (US)
Pages: 639-642
Number of pages: 4
State: Published - 1989
Event: IJCNN International Joint Conference on Neural Networks - Washington, DC, USA
Duration: Jun 18, 1989 - Jun 22, 1989


All Science Journal Classification (ASJC) codes

  • Engineering (all)
