Accelerated gradient learning algorithm for neural network weights update

Neural Computing and Applications - Volume 19 - Pages 219-225 - 2009
Željko Hocenski1, Mladen Antunović1, Damir Filko1
1Faculty of Electrical Engineering, University J.J. Strossmayer, Osijek, Croatia

Abstract

This work proposes a decomposition of the square approximation algorithm for neural network weight updates. The suggested improvement yields an alternative method that converges in fewer iterations and is inherently parallel. The decomposition enables parallel execution convenient for implementation on a computer grid. The improvement is reflected in an accelerated learning rate, which may be essential for time-critical decision processes. The proposed solution is tested and verified in a case study on multilayer perceptron neural networks, varying a wide range of parameters such as the number of inputs/outputs, the length of input/output data, and the number of neurons and layers. Experimental results show time savings of up to 40% in multithreaded execution.
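The abstract does not give the details of the square approximation algorithm or its decomposition, so the following is only a hypothetical sketch of the general idea it describes: when a least-squares weight update is decomposed by output neuron, each sub-problem depends only on its own target column and can therefore run on a separate thread. All names and the choice of solver (`numpy.linalg.lstsq`, `ThreadPoolExecutor`) are illustrative assumptions, not the authors' method.

```python
# Hypothetical illustration: decompose a least-squares weight update
# per output neuron so the independent sub-problems can run in parallel.
# This is NOT the paper's algorithm, only the decomposition idea.
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def update_weights_parallel(X, T, n_threads=4):
    """Fit W minimizing ||X @ W - T||^2, solving one output column per task."""
    def solve_column(j):
        # Each output neuron's weights depend only on its own target
        # column, so the columns of W can be solved independently.
        w, *_ = np.linalg.lstsq(X, T[:, j], rcond=None)
        return j, w

    W = np.empty((X.shape[1], T.shape[1]))
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        for j, w in pool.map(solve_column, range(T.shape[1])):
            W[:, j] = w
    return W
```

Because the column sub-problems share no state, this decomposition maps naturally onto multiple threads or grid nodes, which is the kind of parallelism the abstract credits for the reported speedup.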