Memory Superimposition by Backpropagation Neural Networks

Bibliographic Information

Alternative Title
  • 誤差逆伝播ネットワークによる重ね書き記憶


Abstract


We propose a novel neural network for incremental learning tasks, in which a network must learn new knowledge without forgetting the old. The core of the proposed learning structure is a scheme that transfers short-term memory (STM) into long-term memory (LTM), as in the brain, by means of dynamically changing weights. As the number of LTMs grows, a new network structure is superimposed on the previous one; a lateral inhibition mechanism prevents the new structure from disturbing past LTMs. The proposed structure is shown to outperform conventional backpropagation networks in learning ability.
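The idea of superimposing new weights on consolidated ones can be illustrated with a toy sketch. The following is not the paper's actual algorithm; it is a hypothetical one-weight analogy in which an old "LTM" weight is frozen after learning task A, and a new "STM" weight superimposed on it learns task B, standing in for the lateral-inhibition shielding of past memories.

```python
import numpy as np

rng = np.random.default_rng(0)

# Task A: learn y = 2x with a single weight (this becomes the frozen LTM).
w_ltm = 0.0
for _ in range(200):
    x = rng.uniform(-1, 1)
    y = 2.0 * x
    w_ltm += 0.1 * (y - w_ltm * x) * x   # plain delta rule

# Consolidation: w_ltm is now frozen. Task B (y = -3x) is learned by a new
# STM weight superimposed on the network; the LTM weight receives no updates,
# playing the role of the inhibition that protects past LTMs.
w_stm = 0.0
for _ in range(200):
    x = rng.uniform(-1, 1)
    y = -3.0 * x
    out = (w_ltm + w_stm) * x            # superimposed response
    w_stm += 0.1 * (y - out) * x         # only the STM weight adapts

# The old mapping survives (w_ltm is still near 2), and the superimposed
# network handles the new task (w_ltm + w_stm is near -3).
old_ok = abs(w_ltm - 2.0) < 0.1
new_ok = abs((w_ltm + w_stm) + 3.0) < 0.1
print(old_ok, new_ok)
```

Because only the superimposed weight adapts, retraining on task B cannot corrupt the task-A memory; this is the catastrophic-forgetting failure mode of a plain backpropagation network that the paper's structure is designed to avoid.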

Bulletin




