TY - GEN
T1 - Further enhancements in WOM algorithm to solve the local minimum and flat-spot problem in feed-forward neural networks
AU - Cheung, Chi Chung
AU - Ng, Sin Chun
AU - Lui, Andrew
AU - Xu, Sean Shensheng
N1 - Publisher Copyright:
© 2014 IEEE.
PY - 2014/9/3
Y1 - 2014/9/3
N2 - The backpropagation (BP) algorithm is widely used in supervised learning for feed-forward neural networks. However, it is sometimes slow and easily trapped in a local minimum or a flat-spot area (known as the local minimum problem and the flat-spot problem, respectively). Many modifications have been proposed to speed up its convergence rate, but they seldom improve its global convergence capability. Some fast learning algorithms have recently been proposed to solve these two problems; Wrong Output Modification (WOM) is one new algorithm that can improve global convergence capability significantly. However, WOM has some limitations that prevent it from solving the local minimum and flat-spot problems effectively. In this paper, enhancements are proposed to further improve the performance of WOM by (a) changing the mechanism for escaping from a local minimum or a flat-spot area and (b) adding a fast checking procedure to identify the existence of a local minimum or a flat-spot area. The performance investigation shows that the proposed enhancements improve the performance of WOM significantly when it is applied to different fast learning algorithms. Moreover, WOM with these enhancements is also applied to a very popular second-order learning algorithm, the Levenberg-Marquardt (LM) algorithm. The performance investigation shows that it significantly improves the performance of LM.
AB - The backpropagation (BP) algorithm is widely used in supervised learning for feed-forward neural networks. However, it is sometimes slow and easily trapped in a local minimum or a flat-spot area (known as the local minimum problem and the flat-spot problem, respectively). Many modifications have been proposed to speed up its convergence rate, but they seldom improve its global convergence capability. Some fast learning algorithms have recently been proposed to solve these two problems; Wrong Output Modification (WOM) is one new algorithm that can improve global convergence capability significantly. However, WOM has some limitations that prevent it from solving the local minimum and flat-spot problems effectively. In this paper, enhancements are proposed to further improve the performance of WOM by (a) changing the mechanism for escaping from a local minimum or a flat-spot area and (b) adding a fast checking procedure to identify the existence of a local minimum or a flat-spot area. The performance investigation shows that the proposed enhancements improve the performance of WOM significantly when it is applied to different fast learning algorithms. Moreover, WOM with these enhancements is also applied to a very popular second-order learning algorithm, the Levenberg-Marquardt (LM) algorithm. The performance investigation shows that it significantly improves the performance of LM.
UR - http://www.scopus.com/inward/record.url?scp=84908467036&partnerID=8YFLogxK
U2 - 10.1109/IJCNN.2014.6889622
DO - 10.1109/IJCNN.2014.6889622
M3 - Conference contribution
AN - SCOPUS:84908467036
T3 - Proceedings of the International Joint Conference on Neural Networks
SP - 1225
EP - 1230
BT - Proceedings of the International Joint Conference on Neural Networks
T2 - 2014 International Joint Conference on Neural Networks, IJCNN 2014
Y2 - 6 July 2014 through 11 July 2014
ER -