A NEW STOCHASTIC LEARNING ALGORITHM FOR NEURAL NETWORKS
- Koda Masato, Institute of Policy and Planning Sciences, University of Tsukuba
- Okano Hiroyuki, IBM Research, Tokyo Research Laboratory
Bibliographic Information
- Other Title: New Stochastic Learning Algorithm for Neural Networks
Abstract
A new stochastic learning algorithm using a Gaussian white noise sequence, referred to as Subconscious Noise Reaction (SNR), is proposed for a class of discrete-time neural networks with time-dependent connection weights. Unlike the back-propagation-through-time (BTT) algorithm, SNR does not require the synchronous transmission of information backward along connection weights; instead, it uses only ubiquitous noise and local signals, correlated against a single performance functional, to achieve simple sequential (chronologically ordered) updating of the connection weights. The algorithm is derived and analyzed on the basis of a functional-derivative formulation of the gradient descent method, in conjunction with stochastic sensitivity analysis techniques based on the variational approach.
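The paper's exact SNR update is not reproduced in this record. As a minimal illustration of the idea stated in the abstract — correlating an injected Gaussian white noise sequence against a single scalar performance functional to obtain a gradient estimate, with no backward pass — the following sketch uses a simultaneous-perturbation-style estimator. The function name, step sizes, and the quadratic toy objective are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def noise_correlation_update(w, loss, sigma=0.01, lr=0.05, rng=None):
    """One noise-correlation learning step (illustrative sketch, not the paper's SNR).

    Perturb the weights with Gaussian white noise, observe the change in the
    scalar performance functional, and correlate that change with the injected
    noise to form a stochastic estimate of the gradient.
    """
    rng = np.random.default_rng() if rng is None else rng
    xi = rng.standard_normal(w.shape)           # Gaussian white noise sequence
    delta = loss(w + sigma * xi) - loss(w)      # local scalar performance signal
    grad_est = (delta / sigma) * xi             # noise-correlated gradient estimate
    return w - lr * grad_est                    # gradient-descent update

# Toy usage: minimize a quadratic "performance functional" without backprop.
loss = lambda v: float(np.sum(v ** 2))
rng = np.random.default_rng(0)
w = np.array([2.0, -3.0])
for _ in range(2000):
    w = noise_correlation_update(w, loss, rng=rng)
```

In expectation the correlated term recovers the true gradient, so only forward evaluations of the performance functional are needed — mirroring the abstract's point that no synchronous backward transmission along the weights is required.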
Journal
- Journal of the Operations Research Society of Japan 43 (4), 469-485, 2000
- The Operations Research Society of Japan
Details
- CRID: 1390001204108639488
- NII Article ID: 110001183929
- NII Book ID: AA00703935
- ISSN: 21888299, 04534514
- NDL BIB ID: 5599947
- Text Lang: en
- Data Source: JaLC, NDL, Crossref, CiNii Articles
- Abstract License Flag: Disallowed