On solving the local minima problem of adaptive learning by using deterministic weight evolution algorithm

S. C. Ng, S. H. Leung

Research output: Contribution to conference › Paper › peer-review

6 Citations (Scopus)

Abstract

This paper continues the discussion of the weight evolution algorithm for solving the local minimum problem of back-propagation by changing the weights of a multi-layer neural network in a deterministic way. During the learning phase of back-propagation, the network weights are adjusted deliberately so as to improve system performance. The idea is to work backward from the error components and the system outputs to deduce a deterministic perturbation of particular network weights for optimization purposes. Simulation results show that the weight evolution algorithm consistently outperforms traditional methods in achieving global convergence. Mathematical analysis shows that weight evolution between the hidden and output layers accelerates convergence, whereas weight evolution between the input and hidden layers helps solve the local minima problem.
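The abstract describes the mechanism only at a high level, so the sketch below is purely illustrative: it layers a hypothetical deterministic weight-evolution step on top of plain back-propagation for the XOR problem. The plateau test, the logit back-calculation from the targets, and the least-squares correction of the hidden-to-output weights are all assumptions chosen to mirror the "work backward from the error components and the system outputs" idea; they are not the paper's actual update rule, and only the hidden-to-output evolution is sketched (the paper also evolves input-to-hidden weights).

    # Illustrative sketch only: a hypothetical deterministic weight-evolution
    # step on top of standard back-propagation. The paper's exact evolution
    # formula is not reproduced here.
    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # XOR: a classic small benchmark with known local minima for tiny MLPs.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    # 2-2-1 network.
    W1 = rng.normal(scale=0.5, size=(2, 2)); b1 = np.zeros(2)
    W2 = rng.normal(scale=0.5, size=(2, 1)); b2 = np.zeros(1)

    lr, prev_err, stall = 0.5, np.inf, 0
    for epoch in range(20000):
        # Forward pass.
        H = sigmoid(X @ W1 + b1)      # hidden activations
        Y = sigmoid(H @ W2 + b2)      # system outputs
        E = T - Y                     # error components
        err = 0.5 * np.sum(E ** 2)

        # Standard back-propagation updates.
        dY = -E * Y * (1 - Y)
        dH = (dY @ W2.T) * H * (1 - H)
        W2 -= lr * H.T @ dY; b2 -= lr * dY.sum(0)
        W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(0)

        # Plateau detection: a heuristic stand-in for "trapped in a minimum".
        stall = stall + 1 if abs(prev_err - err) < 1e-7 else 0
        prev_err = err
        if stall >= 50:
            # Hypothetical weight-evolution step: work backward from the
            # desired outputs to pre-sigmoid targets, then solve a
            # least-squares problem for the hidden-to-output weights.
            Z_target = np.log(np.clip(T, 1e-6, 1 - 1e-6) /
                              np.clip(1 - T, 1e-6, 1 - 1e-6))  # inverse sigmoid
            A = np.hstack([H, np.ones((len(H), 1))])
            sol, *_ = np.linalg.lstsq(A, Z_target, rcond=None)
            W2, b2 = sol[:-1], sol[-1]
            stall = 0

        if err < 1e-3:
            break

    print(f"epoch={epoch}  error={err:.5f}  outputs={Y.ravel().round(3)}")

On this toy problem the evolution step gives gradient descent a deterministic escape route whenever training stalls; with other seeds, architectures, or tasks the plateau threshold and the correction step would need retuning.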

Original language: English
Pages: 251-255
Number of pages: 5
Publication status: Published - 2001
Event: Congress on Evolutionary Computation 2001 - Seoul, Korea, Republic of
Duration: 27 May 2001 - 30 May 2001

Conference

Conference: Congress on Evolutionary Computation 2001
Country/Territory: Korea, Republic of
City: Seoul
Period: 27/05/01 - 30/05/01

Keywords

  • Back-propagation learning
  • Neural networks
  • Weight evolution
