Abstractive Sentence Summarization with Supervised Copy-Mechanism

Other Title
  • 教師有りコピー機構を用いた要約文生成

Abstract

Current copy mechanisms are learned as part of a neural text-summarization model through end-to-end training, so it is not explicit which words should be output by copying. To copy appropriate words, we propose a method that learns the copy mechanism in a supervised manner, using estimated labels of which source expressions appear in the summary sentence. Moreover, we verify the effectiveness of a copy mechanism in the Transformer model, which has been used for text summarization but is typically not equipped with one. Our experiments on the headline generation task with automatic evaluation show that the copy mechanism is also effective in the Transformer model, and that our supervised copy mechanism improves summarization performance in both the LSTM-based model and the Transformer model. In particular, our method significantly improves the ROUGE-1 and ROUGE-2 F-measures in the Transformer model.
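The abstract describes supervising the copy mechanism with estimated labels of which source expressions appear in the summary. As a minimal sketch only, and not the authors' procedure, the snippet below shows one simple way such copy-supervision labels could be derived, by marking each summary token that also occurs in the source sentence; the function name, tokenization, and exact-match heuristic are illustrative assumptions.

```python
# Hypothetical illustration: derive binary copy labels for a summary.
# A target token gets label 1 if it also appears in the source sentence
# (a copy candidate), and 0 otherwise. Such labels could then supervise
# a copy gate alongside the usual generation loss.

def copy_labels(source_tokens, target_tokens):
    """Return 1 for target tokens that occur in the source, else 0."""
    source_vocab = set(source_tokens)
    return [1 if tok in source_vocab else 0 for tok in target_tokens]


if __name__ == "__main__":
    src = "the central bank raised interest rates on tuesday".split()
    tgt = "central bank raises rates".split()
    # [1, 1, 0, 1] -> "raises" is not in the source and must be generated
    print(copy_labels(src, tgt))
```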

Details

  • CRID
    1390003825189323776
  • NII Article ID
    130007856933
  • DOI
    10.11517/pjsai.jsai2020.0_2h6gs901
  • Text Lang
    ja
  • Data Source
    • JaLC
    • CiNii Articles
  • Abstract License Flag
    Disallowed
