Superimposing Memory by Dynamic and Spatial Changing Synaptic Weights
-
- Homma Noriyasu
- Tohoku University
-
- Gupta Madan M.
- University of Saskatchewan
-
- Abe Kenichi
- Tohoku University
-
- Takeda Hiroshi
- Tohoku Gakuin University
Description
In this paper, a novel neural network model is presented for incremental learning tasks, in which a network is required to learn new knowledge without forgetting the old. The essential core of the proposed network structure is its dynamic and spatial changing weights (DSCWs). A learning scheme is developed for the formulation of the dynamically changing weights, while structural adaptation is formulated by the spatially changing (growing) connection weights. As new synaptic connections are formed, a new network structure is superimposed on the previous one. In this superimposition, a restoration mechanism based on the DSCWs is introduced to prevent the newly created connections from disturbing past knowledge. The usefulness of the proposed model is demonstrated through pattern classification and system identification tasks.
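The superimposition idea in the abstract — grow new connections for a new task while a restoration mechanism keeps previously learned weights from drifting — can be illustrated with a minimal sketch. This is an assumption-laden toy (a linear single-layer network with a quadratic restoration penalty; all function and variable names are hypothetical), not the paper's actual DSCW formulation:

```python
import numpy as np

# Toy sketch of "superimposing" structure: learn task A, then grow an
# extra input weight for task B while a restoration penalty pulls the
# old weight block back toward its stored values (a stand-in for the
# paper's DSCW-based restoration mechanism).

rng = np.random.default_rng(0)

def train(W, X, Y, W_old=None, lam=0.0, lr=0.1, epochs=200):
    """Gradient descent on mean squared error. If W_old is given, add a
    restoration term lam * ||W_block - W_old||^2 over the sub-block of W
    that was learned on the previous task."""
    for _ in range(epochs):
        err = X @ W - Y                    # residual, shape (n, out)
        grad = X.T @ err / len(X)          # dE/dW for the squared error
        if W_old is not None:
            r, c = W_old.shape
            # Restoration: penalize drift of the previously learned block.
            grad[:r, :c] += lam * (W[:r, :c] - W_old)
        W -= lr * grad
    return W

# Task A: 2 inputs -> 1 output, true mapping y = x1 - 2*x2.
XA = rng.standard_normal((100, 2))
YA = XA @ np.array([[1.0], [-2.0]])
WA = train(rng.standard_normal((2, 1)) * 0.1, XA, YA)

# Task B: a third input appears. Spatial growth: superimpose a new
# (zero-initialized) weight row on the previously learned structure.
W = np.vstack([WA, np.zeros((1, 1))])
XB = rng.standard_normal((100, 3))
YB = XB @ np.array([[1.0], [-2.0], [0.5]])
W = train(W, XB, YB, W_old=WA.copy(), lam=1.0)

# After training, W[:2] stays close to WA: the old knowledge survives
# the creation of the new connection.
```

The design point the sketch captures is that the new connection starts at zero (so it initially leaves the old input–output mapping untouched) and the restoration term anchors the old block during subsequent training; the paper realizes a related effect through its dynamic weight formulation rather than an explicit penalty.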
Publication
-
- SICE Annual Conference Program and Abstracts
-
SICE Annual Conference Program and Abstracts 2002 (0), 677-677, 2002
The Society of Instrument and Control Engineers (SICE)
Details
-
- CRID
- 1390282680560574080
-
- NII Article ID
- 130006959934
-
- Text Language Code
- en
-
- Data Source Type
-
- JaLC
- CiNii Articles
-
- Abstract License Flag
- Not available for reuse