非線形方程式の求解によるニューラルネットの学習法 (Learning Algorithm for Neural Networks by Solving Nonlinear Equations)

Bibliographic Information

Alternative Titles
  • Learning Algorithm for Neural Networks by Solving Nonlinear Equations
  • ヒセンケイ ホウテイシキ ノ キュウカイ ニ ヨル ニューラル ネット ノ ガ

Abstract

Backpropagation is the most widely used learning algorithm for neural networks, but it has some well-known drawbacks. The most important is the difficulty of choosing appropriate values for the various learning parameters: inadequate values result in very slow convergence, and in the most serious case the backpropagation process encounters a local minimum and is thus unable to solve the learning problem. These drawbacks stem from the fact that backpropagation is a gradient-based optimization procedure without line search.

In this paper, we propose a new learning algorithm based on solving the nonlinear equations that represent the output errors. We basically use Newton's method for these nonlinear equations, since its convergence is superior; however, Newton's method does not ensure global convergence, as it depends on the initial point. We therefore employ the homotopy continuation method, one of the parametric methods, to overcome this drawback. The proposed method is tested on various learning problems, and the computational results show that it is superior to backpropagation in convergence.
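The abstract only sketches the algorithm. A minimal Python illustration of the idea, under several assumptions that do not appear in the abstract (a tiny 2-2-1 tanh network on XOR with targets scaled to +/-0.8, a finite-difference Jacobian, a least-norm Newton step for the underdetermined error system, and the standard Newton homotopy H(w, t) = F(w) - (1 - t) * F(w0) tracked from t = 0 to t = 1), might look like the following; it is not a reconstruction of the authors' actual implementation.

    import numpy as np

    # Hypothetical toy problem: XOR with a 2-2-1 tanh network, targets scaled to
    # +/-0.8 so that the output-error equations F(w) = 0 are exactly solvable.
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    T = np.array([-0.8, 0.8, 0.8, -0.8])

    def forward(w, x):
        # w packs all 9 weights: 2x2 input->hidden, 2 hidden biases,
        # 2 hidden->output weights, 1 output bias.
        W1, b1, W2, b2 = w[:4].reshape(2, 2), w[4:6], w[6:8], w[8]
        h = np.tanh(x @ W1 + b1)
        return np.tanh(h @ W2 + b2)

    def F(w):
        # Output-error equations, one per training pattern: F(w) = 0 at a solution.
        return forward(w, X) - T

    def jacobian(f, w, eps=1e-6):
        # Forward-difference approximation of the Jacobian of f at w.
        f0 = f(w)
        J = np.empty((f0.size, w.size))
        for i in range(w.size):
            dw = np.zeros_like(w)
            dw[i] = eps
            J[:, i] = (f(w + dw) - f0) / eps
        return J

    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.5, size=9)   # arbitrary starting point
    F0 = F(w)                           # residual at the starting point

    # Newton homotopy H(w, t) = F(w) - (1 - t) * F0: trivially solved by the
    # starting point at t = 0, identical to the original system F(w) = 0 at t = 1.
    for t in np.linspace(0.0, 1.0, 21):
        def H(v, t=t):
            return F(v) - (1.0 - t) * F0
        for _ in range(20):             # Newton corrector at this homotopy level
            r = H(w)
            if np.linalg.norm(r) < 1e-10:
                break
            J = jacobian(H, w)
            # The system is underdetermined (4 equations, 9 weights); take the
            # least-norm Newton step -- an assumption, the paper may differ.
            w = w - np.linalg.lstsq(J, r, rcond=None)[0]

    print("residual norm:", np.linalg.norm(F(w)))   # near zero if tracking succeeded
    print("network outputs:", forward(w, X))

If the tracking succeeds, each intermediate system H(., t) = 0 is solved from a nearby starting point supplied by the previous homotopy level, which is how the continuation is meant to recover the global convergence that a plain Newton iteration started far from a root does not guarantee.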
