The Return of Lexical Dependencies: Neural Lexicalized PCFGs

  • Hao Zhu
    Language Technologies Institute, Carnegie Mellon University.
  • Yonatan Bisk
    Language Technologies Institute, Carnegie Mellon University.
  • Graham Neubig
    Language Technologies Institute, Carnegie Mellon University.

Abstract

In this paper we demonstrate that context-free grammar (CFG) based methods for grammar induction benefit from modeling lexical dependencies. This contrasts with the most popular current methods for grammar induction, which focus on discovering either constituents or dependencies. Previous approaches to marrying these two disparate syntactic formalisms (e.g., lexicalized PCFGs) have been plagued by sparsity, making them unsuitable for unsupervised grammar induction. In this work, however, we present novel neural models of lexicalized PCFGs that allow us to overcome sparsity problems and effectively induce both constituents and dependencies within a single model. Experiments demonstrate that this unified framework produces stronger results on both representations than those achieved when modeling either formalism alone.
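To illustrate the core idea, here is a minimal, hypothetical sketch of how a lexicalized PCFG rule distribution can be parameterized neurally: rule probabilities are computed from symbol and head-word embeddings rather than estimated from sparse counts, so unseen (rule, head word) combinations still receive probability mass. The framework choice (PyTorch), the class and parameter names, and the dimensions below are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class NeuralLexicalizedPCFG(nn.Module):
    """Sketch of a neural parameterization for lexicalized PCFG rules.

    A binary lexicalized rule has the form A[h] -> B[h] C[h'], where the
    head word h percolates from one child to the parent. Instead of a
    count-based table over (rule, head word) pairs, each distribution is
    produced by a small network over embeddings. (Illustrative only.)
    """

    def __init__(self, n_nonterminals, vocab_size, dim=64):
        super().__init__()
        self.sym_emb = nn.Embedding(n_nonterminals, dim)   # nonterminal symbols
        self.word_emb = nn.Embedding(vocab_size, dim)      # head words
        # Scores every (B, C) child-symbol pair for a given (A, h).
        self.scorer = nn.Sequential(
            nn.Linear(2 * dim, dim),
            nn.ReLU(),
            nn.Linear(dim, n_nonterminals * n_nonterminals),
        )

    def rule_log_probs(self, parent, head_word):
        """Return log p(B, C | A, h) over all child-symbol pairs."""
        x = torch.cat(
            [self.sym_emb(parent), self.word_emb(head_word)], dim=-1
        )
        logits = self.scorer(x)
        return torch.log_softmax(logits, dim=-1)


model = NeuralLexicalizedPCFG(n_nonterminals=30, vocab_size=10000)
lp = model.rule_log_probs(torch.tensor([3]), torch.tensor([42]))
print(lp.shape)  # torch.Size([1, 900]) -- one log-prob per (B, C) pair
```

Because embeddings are shared across rules and head words, estimates for rare words are smoothed toward those of similar words; this parameter sharing is the mechanism the abstract credits with overcoming the sparsity that plagued earlier count-based lexicalized PCFGs.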
