Evolution strategies on connection weights into modified gradient function for multi-layer neural networks

S. C. Ng, S. H. Leung, Andrew Luk

Research output: Contribution to conference › Paper › peer-review

1 Citation (Scopus)

Abstract

In this paper, two modifications of the conventional back-propagation algorithm for feedforward multi-layer neural networks are presented. One modification changes the calculation of the gradient function, while the other incorporates evolution strategies on the connection weights into the gradient search algorithm. Simulation results show that the new modified algorithm always converges to the global optimal solution and performs better than other fast learning algorithms and global search methods.
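As a rough illustration of the kind of hybrid search the abstract describes, the sketch below interleaves ordinary back-propagation updates with evolution-strategy style Gaussian perturbations of the connection weights, keeping a perturbation only when it lowers the training loss. This is an assumed reconstruction for illustration only, not the modified gradient function or the specific evolution strategy defined in the paper; all names and parameters here (grad_step, es_perturb, sigma, trials) are hypothetical.

```python
# Hypothetical sketch, not the authors' algorithm: gradient descent on a small
# feedforward net combined with ES-style weight perturbations.
import numpy as np

rng = np.random.default_rng(0)

def forward(W1, W2, X):
    # Two-layer feedforward net: sigmoid hidden layer, linear output.
    H = 1.0 / (1.0 + np.exp(-X @ W1))
    return H, H @ W2

def loss(W1, W2, X, Y):
    _, out = forward(W1, W2, X)
    return float(np.mean((out - Y) ** 2))

def grad_step(W1, W2, X, Y, lr=0.5):
    # Standard back-propagation update for the mean squared error.
    H, out = forward(W1, W2, X)
    d_out = 2.0 * (out - Y) / len(X)
    gW2 = H.T @ d_out
    d_hid = (d_out @ W2.T) * H * (1.0 - H)
    gW1 = X.T @ d_hid
    return W1 - lr * gW1, W2 - lr * gW2

def es_perturb(W1, W2, X, Y, sigma=0.05, trials=5):
    # ES-style move: sample Gaussian perturbations of the connection weights
    # and keep the best candidate only if it improves on the current loss.
    best_loss, best = loss(W1, W2, X, Y), (W1, W2)
    for _ in range(trials):
        c1 = W1 + sigma * rng.standard_normal(W1.shape)
        c2 = W2 + sigma * rng.standard_normal(W2.shape)
        cand_loss = loss(c1, c2, X, Y)
        if cand_loss < best_loss:
            best_loss, best = cand_loss, (c1, c2)
    return best

# Toy usage: XOR with a bias column appended to the inputs.
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)
W1 = 0.5 * rng.standard_normal((3, 4))
W2 = 0.5 * rng.standard_normal((4, 1))
for epoch in range(5000):
    W1, W2 = grad_step(W1, W2, X, Y)
    if epoch % 50 == 0:            # interleave occasional ES perturbations
        W1, W2 = es_perturb(W1, W2, X, Y)
print("final training loss:", loss(W1, W2, X, Y))
```

The intent of the interleaving is that the gradient steps provide fast local convergence while the accepted random perturbations offer a chance to escape poor local minima; the paper's actual combination of the two components may differ.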

Original language: English
Pages: 1371-1376
Number of pages: 6
DOIs
Publication status: Published - 2005
Event: International Joint Conference on Neural Networks, IJCNN 2005 - Montreal, QC, Canada
Duration: 31 Jul 2005 – 4 Aug 2005

Conference

Conference: International Joint Conference on Neural Networks, IJCNN 2005
Country/Territory: Canada
City: Montreal, QC
Period: 31/07/05 → 4/08/05

