Superimposing Memory by Dynamic and Spatial Changing Synaptic Weights

Description

This paper presents a novel neural network model for incremental learning tasks, in which a network must acquire new knowledge without forgetting what it has already learned. The essential core of the proposed structure is its dynamically and spatially changing weights (DSCWs). A learning scheme is developed to formulate the dynamically changing weights, while structural adaptation is realized through the spatially changing (growing) synaptic connections. As new connections are formed, the new network structure is superimposed on the previous one. To prevent the newly created connections from disturbing past knowledge during this superimposition, a restoration mechanism based on the DSCWs is introduced. The usefulness of the proposed model is demonstrated on pattern classification and system identification tasks.

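The abstract describes the mechanism only at a high level, so the following is a minimal NumPy sketch of the grow-and-restore idea, not the paper's actual formulation: spatial change is modeled as a growing binary connection mask, and dynamic change as gradient updates combined with a restoring pull of previously consolidated connections back toward their stored values. All names here (DSCWLayer, grow, consolidate, restore_strength) are hypothetical.

```python
import numpy as np

class DSCWLayer:
    """Toy one-layer network with grow-and-restore weight dynamics.

    A hypothetical illustration: the class and parameter names are
    assumptions, not the paper's notation or algorithm.
    """

    def __init__(self, n_in, n_out, lr=0.1, restore_strength=0.5):
        self.W = np.zeros((n_out, n_in))         # synaptic weights
        self.mask = np.zeros((n_out, n_in))      # 1 where a connection exists
        self.anchor = np.zeros((n_out, n_in))    # consolidated weight values
        self.old_mask = np.zeros((n_out, n_in))  # connections holding old knowledge
        self.lr = lr
        self.restore_strength = restore_strength

    def forward(self, x):
        # Only the currently grown connections carry signal.
        return np.tanh((self.W * self.mask) @ x)

    def grow(self, new_mask):
        # Spatial change: superimpose new connections on the old structure.
        self.mask = np.maximum(self.mask, new_mask)

    def consolidate(self):
        # Freeze the current structure and weights as the restoration target.
        self.anchor = self.W.copy()
        self.old_mask = self.mask.copy()

    def train_step(self, x, target):
        # Dynamic change: a gradient step on the squared error, plus a
        # restoring pull of the old connections toward their consolidated
        # values, so that new learning does not erase old knowledge.
        y = self.forward(x)
        err = y - target
        grad = np.outer(err * (1.0 - y**2), x) * self.mask
        restore = self.restore_strength * (self.W - self.anchor) * self.old_mask
        self.W -= self.lr * (grad + restore)
        return float(0.5 * np.sum(err**2))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    layer = DSCWLayer(n_in=4, n_out=2)
    layer.grow((rng.random((2, 4)) < 0.5).astype(float))   # initial structure
    for _ in range(200):                                   # learn task A
        layer.train_step(np.array([1.0, 0.0, 1.0, 0.0]), np.array([0.5, -0.5]))
    layer.consolidate()                                    # protect task A
    layer.grow((rng.random((2, 4)) < 0.5).astype(float))   # superimpose new links
    for _ in range(200):                                   # learn task B
        layer.train_step(np.array([0.0, 1.0, 0.0, 1.0]), np.array([-0.5, 0.5]))
```

Under these assumptions, the restoring term acts only on consolidated connections, so previously learned weights are pulled back toward their stored values while newly grown connections remain free to learn the incoming task.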

Details

  • CRID
    1390282680560574080
  • NII Article ID
    130006959934
  • DOI
    10.11499/sicep.2002.0.677.0
  • Text Lang
    en
  • Data Source
    • JaLC
    • CiNii Articles
  • Abstract License Flag
    Disallowed
