Error Back-Propagation for a Recurrent Neural Network Using the Least-Squares Method

Bibliographic Information

Other Titles
  • The Back Propagation Method Using the Least Mean-Square Method for the Output Recurrent Neural Network
  • サイショウ 2 ジョウホウ オ テキヨウ シタ カイキガタ ニューラル ネットワーク ノ ゴサ ギャクデンパホウ

Description

The back-propagation method, based on the gradient method, is often used as the learning rule of a neural network. This paper proposes a back-propagation method that instead uses the least mean-square method for an output recurrent neural network. The approach consists of determining the input vector and estimating the parameters of each layer. The input vector of the output layer is corrected so as to decrease the output error, according to the learning rate and the learned values of the other layers. The parameters of each layer are then calculated by the least-squares method from that layer's obtained inputs and outputs.

An identification experiment on a linear oscillation system demonstrates the effectiveness of the proposed algorithm, which is not based on the gradient method: it yields a better estimate than the classical back-propagation method.
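The core idea described above, replacing gradient descent with a per-layer least-squares parameter estimate from collected layer inputs and target outputs, can be sketched on a toy identification problem. This is a minimal illustrative sketch, not the paper's actual algorithm: the function name `fit_layer_lstsq`, the first-order test system, and all numeric values are assumptions for illustration.

```python
import numpy as np

def fit_layer_lstsq(inputs, targets):
    """Estimate layer weights W minimizing ||inputs @ W - targets||^2
    by ordinary least squares (no gradient descent)."""
    W, *_ = np.linalg.lstsq(inputs, targets, rcond=None)
    return W

# Toy linear system to identify (hypothetical example, not the paper's
# oscillation system): y[k+1] = a*y[k] + b*u[k]
rng = np.random.default_rng(0)
a_true, b_true = 0.8, 0.5
u = rng.normal(size=200)
y = np.zeros(201)
for k in range(200):
    y[k + 1] = a_true * y[k] + b_true * u[k]

# Input vector contains the fed-back previous output (the "recurrent"
# part) alongside the external input; the target is the next output.
X = np.column_stack([y[:-1], u])   # shape (200, 2)
T = y[1:]                          # desired outputs, shape (200,)
w = fit_layer_lstsq(X, T)
print(w)  # recovers approximately [0.8, 0.5] on this noiseless data
```

For a single linear layer with noiseless data this solves the estimation in one shot, which is the appeal of a least-squares update over iterative gradient steps; the paper extends the idea to a multi-layer recurrent network by also correcting the output layer's input vector to reduce the output error.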
