The Back Propagation Method Using the Least Mean-Square Method for the Output Recurrent Neural Network
-
- YAMAWAKI Shigenobu (Faculty of Science and Technology, Kinki University)
- FUJINO Masashi (NISSIN SYSTEMS CO. LTD.)
- IMAO Syozo (Faculty of Science and Technology, Kinki University)
Bibliographic Information
- Other Title
-
- 最小2乗法を適用した回帰型ニューラルネットワークの誤差逆伝播法 (Japanese title: "The Back Propagation Method Applying the Least-Squares Method to the Recurrent Neural Network")
Description
The back propagation method, based on the gradient method, is often used as the learning rule of a neural network. This paper proposes a back propagation method that uses the least mean-square method for the output recurrent neural network. The approach consists of determining the input vector and estimating the parameters of each layer. The input vector of the output layer is corrected so as to decrease the output error, according to the learning rate and the learning value of the other layer. The parameters are then calculated by the least-squares method from the obtained input and output of each layer.

The identification result for a linear oscillation system shows the effectiveness of the proposed algorithm, which is not based on the gradient method. It is shown that a better estimate is obtained by the proposed algorithm than by the classical back propagation method.
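The abstract describes the learning procedure only at a high level: each layer's parameters are solved by least squares from that layer's observed input and output, and the input vector of the output layer is corrected toward a smaller output error. The following is a minimal, hypothetical sketch of that idea for a two-layer network with an output-recurrent input; the network shape, the tanh/arctanh inversion of the hidden layer, the correction rule, and all names are illustrative assumptions, not the paper's exact algorithm.

```python
# Hypothetical sketch of least-squares, layer-wise learning for an
# output-recurrent network; NOT the paper's exact algorithm.
import numpy as np

rng = np.random.default_rng(0)

def lstsq_weights(X, T):
    """Estimate weights W such that X @ W ~ T by ordinary least squares."""
    W, *_ = np.linalg.lstsq(X, T, rcond=None)
    return W

# Toy identification task: y[k] = 0.8*y[k-1] + 0.5*u[k-1] (a stand-in for the
# linear oscillation system used in the paper).
N = 200
u = rng.standard_normal((N, 1))
y = np.zeros((N, 1))
for k in range(1, N):
    y[k] = 0.8 * y[k - 1] + 0.5 * u[k - 1]

u_prev = np.vstack([np.zeros((1, 1)), u[:-1]])
y_prev = np.vstack([np.zeros((1, 1)), y[:-1]])   # fed-back output (recurrence)
X = np.hstack([u_prev, y_prev])                  # network input: (u[k-1], y[k-1])

W_in = 0.1 * rng.standard_normal((2, 4))         # hidden-layer weights (assumed tanh layer)
eta = 0.5                                        # learning rate for the input-vector correction

for _ in range(20):
    H = np.tanh(X @ W_in)                        # hidden output = input vector of the output layer
    W_out = lstsq_weights(H, y)                  # output-layer parameters by least squares
    err = y - H @ W_out                          # output error
    # Correct the output layer's input vector toward a smaller output error,
    # then re-estimate the hidden-layer parameters by least squares on the
    # (inverted) corrected targets.
    H_new = np.clip(H + eta * err @ W_out.T, -0.999, 0.999)
    W_in = lstsq_weights(X, np.arctanh(H_new))

mse = float(np.mean((y - np.tanh(X @ W_in) @ W_out) ** 2))
print(f"final output MSE: {mse:.6f}")
```

The point of the sketch is the design choice named in the abstract: every parameter update is a closed-form least-squares solve on observed layer inputs and outputs, so no gradient of the error function is back-propagated through the network.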
Journal
-
- Transactions of the Institute of Systems, Control and Information Engineers, 12 (4), 225-233, 1999
- The Institute of Systems, Control and Information Engineers (ISCIE)
Details
- CRID: 1390282680142293248
- NII Article ID: 10004333992
- NII Book ID: AN1013280X
- ISSN: 2185-811X, 1342-5668
- NDL BIB ID: 4692516
- Data Source: JaLC, NDL Search, Crossref, CiNii Articles
- Abstract License Flag: Disallowed