Improved kick out learning algorithm with delta-bar-delta-bar rule
Description
A new adaptive rule, called the delta-bar-delta-bar rule, is proposed. It improves robustness in setting the increment and decrement factors of the learning rate of an accelerated backpropagation algorithm. The rule is introduced into the kick out algorithm and is shown to be effective in extracting that algorithm's best performance. With the delta-bar-delta-bar rule, the rate of convergence of the kick out algorithm is substantially improved even when the learning rates are not optimally set.
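The abstract gives no update equations, so for context the sketch below shows the underlying delta-bar-delta rule (Jacobs, 1988) that the proposed rule extends: each parameter's learning rate grows additively while the gradient sign agrees with a running average of past gradients, and shrinks multiplicatively when it flips. The function name and hyperparameter values (`kappa`, `phi`, `theta`) are illustrative assumptions; the paper's delta-bar-delta-bar rule reportedly makes the increment and decrement factors themselves robust to adapt, which is not reproduced here.

```python
import numpy as np

def delta_bar_delta_update(lr, grad, grad_bar, kappa=0.01, phi=0.1, theta=0.7):
    """One step of the delta-bar-delta learning-rate adaptation (sketch).

    lr, grad, grad_bar are per-parameter arrays. kappa (additive
    increment), phi (multiplicative decrement), and theta (averaging
    factor) are the factors the delta-bar-delta-bar rule is said to
    make robust; the values here are illustrative only.
    """
    agree = grad_bar * grad                        # sign agreement with the running average
    lr = np.where(agree > 0, lr + kappa, lr)       # consistent sign: grow additively
    lr = np.where(agree < 0, lr * (1 - phi), lr)   # sign flip: shrink multiplicatively
    grad_bar = (1 - theta) * grad + theta * grad_bar  # update the "bar" average
    return lr, grad_bar
```

In a gradient-descent loop this would be applied per step as `lr, grad_bar = delta_bar_delta_update(lr, grad, grad_bar)` followed by `w -= lr * grad`, with the kick out algorithm's momentum-style terms layered on top.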
Published in
- IEEE International Conference on Neural Networks, pp. 269-274, 2002-12-30, IEEE