Dialogue over context and structured knowledge using a neural network model with an external memory

Bibliographic Information

Other Title
  • 記憶装置付きニューラルネットワークモデルによる文脈と構造化知識を用いた対話

Abstract

<p>In recent years, sequence-to-sequence models such as Seq2Seq and the Transformer have become commonplace in dialogue architectures. However, more natural and intelligent dialogue requires understanding context and using knowledge, and researchers have argued that such models are limited in their ability to retain information over long time spans. To retain long-term information, neural network models with external memories, such as End-To-End Memory Networks and the Differentiable Neural Computer (DNC), have been proposed. In this work, we extend the DNC architecture and propose a model that uses both context and structured knowledge. We conducted an experiment on a dataset composed of series of coherently linked questions, which require a large-scale knowledge graph, and their answers. The mean test error rate after 20k iterations was 69.25%, slightly higher than the original DNC's error rate of 69.09%.</p>
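To make the external-memory idea concrete, the following is a minimal sketch (not the authors' code) of the content-based addressing used in DNC-style memories: the controller emits a lookup key, and the read weighting is a softmax over cosine similarities between the key and each memory slot. All names and parameters here are illustrative assumptions.

```python
import numpy as np

def content_weighting(memory, key, beta):
    """memory: (N, W) matrix of N slots; key: (W,) lookup key;
    beta: key strength that sharpens the softmax."""
    norms = np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    sim = memory @ key / norms              # cosine similarity per slot
    e = np.exp(beta * (sim - sim.max()))    # numerically stable softmax
    return e / e.sum()

def read(memory, key, beta):
    # Read vector is the weighting-weighted sum of memory rows.
    w = content_weighting(memory, key, beta)
    return w @ memory

# Illustrative memory with three slots; a high key strength makes the
# read vector approach the slot most similar to the key.
M = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.7, 0.7]])
r = read(M, np.array([1.0, 0.1]), beta=10.0)
```

In the full DNC, this content-based weighting is combined with temporal-link and allocation mechanisms, and the weights are produced by a trained controller rather than fixed keys; this sketch only shows the lookup step.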

Details

  • CRID
    1390848250119671808
  • NII Article ID
    130007857171
  • DOI
    10.11517/pjsai.jsai2020.0_3q1gs902
  • Text Lang
    ja
  • Data Source
    • JaLC
    • CiNii Articles
  • Abstract License Flag
    Disallowed
