Abstract
This paper presents the integration of the magnified gradient function and weight evolution algorithms to solve the local minima problem. The combination of the two algorithms gives a significant improvement in convergence rate and global search capability compared with common fast learning algorithms such as standard back-propagation, Quickprop, Resilient propagation (RPROP), SARPROP, and genetic algorithms (GA).
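To make the idea concrete, the following is a minimal illustrative sketch, not the paper's exact formulation: back-propagation on a toy XOR task where the sigmoid derivative is raised to 1/S (a magnified-gradient-style delta, with S an assumed magnification factor) and a simple weight-perturbation step stands in for weight evolution whenever the error stalls. The network size, learning rate, stall test, and perturbation scale are all assumptions chosen only for illustration.

```python
# Illustrative sketch only: MGF-style delta plus a crude weight-perturbation
# step when training stalls. Not the authors' exact algorithm.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_xor(epochs=5000, lr=0.5, S=2.0, stall_window=200, tol=1e-4):
    # XOR toy problem
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    W1 = rng.normal(scale=0.5, size=(2, 3))   # input -> hidden weights
    W2 = rng.normal(scale=0.5, size=(3, 1))   # hidden -> output weights

    prev_err, stalled = np.inf, 0
    for _ in range(epochs):
        # forward pass
        H = sigmoid(X @ W1)
        O = sigmoid(H @ W2)
        err = float(np.mean((T - O) ** 2))

        # "magnified" deltas: raise the sigmoid derivative o(1 - o) to the
        # power 1/S (S >= 1) so the gradient does not vanish on flat regions.
        d_out = (T - O) * (O * (1 - O)) ** (1.0 / S)
        d_hid = (d_out @ W2.T) * (H * (1 - H)) ** (1.0 / S)

        W2 += lr * H.T @ d_out
        W1 += lr * X.T @ d_hid

        # stand-in for weight evolution: if the error has not improved for a
        # while, perturb the weights to try to escape a local minimum.
        stalled = stalled + 1 if prev_err - err < tol else 0
        prev_err = err
        if stalled >= stall_window:
            W1 += rng.normal(scale=0.1, size=W1.shape)
            W2 += rng.normal(scale=0.1, size=W2.shape)
            stalled = 0
    return err

if __name__ == "__main__":
    print("final MSE:", train_xor())
```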
Original language | English |
---|---|
Pages | 767-772 |
Number of pages | 6 |
Publication status | Published - 2002 |
Event | 2002 International Joint Conference on Neural Networks (IJCNN '02) - Honolulu, HI, United States (12 May 2002 → 17 May 2002) |
Conference
Conference | 2002 International Joint Conference on Neural Networks (IJCNN '02) |
---|---|
Country/Territory | United States |
City | Honolulu, HI |
Period | 12/05/02 → 17/05/02 |