Superimposing Memory by Dynamic and Spatial Changing Synaptic Weights

Description

This paper presents a novel neural network model for incremental learning tasks, in which a network must learn new knowledge without forgetting the old. The essential core of the proposed network structure is its dynamic and spatial changing weights (DSCWs). A learning scheme is developed for the formulation of the dynamically changing weights, while structural adaptation is realized through the spatially changing (growing) connection weights. As new synaptic connections form, the new network structure is superimposed on the previous one. To prevent the creation of new connections from disturbing past knowledge during this superimposition, a restoration mechanism based on the DSCWs is introduced. The usefulness of the proposed model is demonstrated on pattern classification and system identification tasks.
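The superimposition idea described in the abstract can be illustrated with a minimal sketch: consolidate the trained weights, grow new connections for new knowledge, and restore the consolidated weights after each update so learning the new task does not overwrite the old. The class and method names, and the simple copy-back restoration rule, are illustrative assumptions, not the paper's exact DSCW formulation.

```python
import numpy as np

class GrowingLayer:
    """Hypothetical sketch of a layer that superimposes new connections
    on previously learned structure (not the paper's exact algorithm)."""

    def __init__(self, n_in, n_out, seed=0):
        self.rng = np.random.default_rng(seed)
        self.W = self.rng.normal(scale=0.1, size=(n_out, n_in))
        # Mask of consolidated ("old knowledge") weights to be preserved.
        self.frozen = np.zeros_like(self.W, dtype=bool)
        self.W_old = np.zeros_like(self.W)

    def consolidate(self):
        """Mark all current weights as past knowledge."""
        self.frozen[:] = True
        self.W_old = self.W.copy()

    def grow(self, n_new_in):
        """Spatial growth: append columns for new input connections."""
        new_cols = self.rng.normal(scale=0.1, size=(self.W.shape[0], n_new_in))
        self.W = np.hstack([self.W, new_cols])
        self.frozen = np.hstack(
            [self.frozen, np.zeros(new_cols.shape, dtype=bool)])
        self.W_old = np.hstack([self.W_old, np.zeros(new_cols.shape)])

    def update(self, grad, lr=0.01):
        """Gradient step, then restoration of the consolidated weights."""
        self.W -= lr * grad
        # Restoration mechanism: undo any disturbance of past knowledge.
        self.W[self.frozen] = self.W_old[self.frozen]
```

Only the newly grown (unfrozen) connections absorb the update, so the old mapping survives the gradient step unchanged.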

Details

  • CRID
    1390282680560574080
  • NII Article ID
    130006959934
  • DOI
    10.11499/sicep.2002.0.677.0
  • Full-text language code
    en
  • Data source type
    • JaLC
    • CiNii Articles
  • Abstract license flag
    Not permitted
