Contextualized Multi-Sense Word Embedding
-
- Ashihara Kazuki
- Graduate School of Information Science and Technology, Osaka University
-
- Kajiwara Tomoyuki
- Institute for Datability Science, Osaka University
-
- Arase Yuki
- Graduate School of Information Science and Technology, Osaka University
-
- Uchida Satoru
- Faculty of Languages and Cultures, Kyushu University
Bibliographic Information
- Other Title
-
- 多義語分散表現の文脈化 (Contextualization of Multi-Sense Word Embeddings)
Abstract
<p>Distributed word representations are currently employed in many natural language processing tasks. However, when a single representation is generated per word, the meanings of a polysemous word cannot be distinguished, because all of its senses are merged into one vector. Several attempts have therefore been made to generate a separate representation per meaning, based on part of speech or sentence topic, but these methods are too coarse-grained to handle polysemy. In this paper, we propose two methods that generate finer-grained multiple word representations. The first generates multiple representations for a word using the word it stands in a dependency relation with as a clue. The second employs a bi-directional language model to generate a word representation that takes all the words in the context into account. Extensive evaluation on the Lexical Substitution and Context-Aware Word Similarity tasks confirms that our approaches generate finer-grained multiple word representations. </p>
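To illustrate the general idea behind context-dependent sense representations (this is a minimal, hypothetical sketch for intuition only, not the authors' method or implementation), the following toy example blends a word's static vector with the mean vector of its surrounding words, so that the same surface form receives different vectors in different contexts:

```python
# Illustrative sketch only: distinguishing senses of a polysemous word by
# mixing its static embedding with a vector built from the bidirectional
# (left and right) context. Toy 3-dimensional vectors; all names here are
# hypothetical and chosen for this example.

STATIC_VECTORS = {
    "bank":  [0.9, 0.1, 0.0],
    "river": [0.0, 0.8, 0.2],
    "money": [0.7, 0.0, 0.3],
    "the":   [0.1, 0.1, 0.1],
}

def mean(vectors):
    """Element-wise mean of a list of equal-length vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def contextualize(tokens, index, alpha=0.5):
    """Blend the target word's static vector with the mean of all other
    words' vectors in the sentence (its bidirectional context)."""
    target = STATIC_VECTORS[tokens[index]]
    context = [STATIC_VECTORS[t] for i, t in enumerate(tokens) if i != index]
    ctx = mean(context)
    return [alpha * t + (1 - alpha) * c for t, c in zip(target, ctx)]

# The two occurrences of "bank" now get different, context-specific vectors.
v_river = contextualize(["the", "river", "bank"], 2)
v_money = contextualize(["the", "money", "bank"], 2)
print(v_river != v_money)  # prints True
```

A real contextualized model replaces the simple averaging above with a learned bi-directional language model, but the output contract is the same: one vector per token occurrence rather than one vector per word type.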
Journal
-
- Journal of Natural Language Processing
-
Journal of Natural Language Processing 26 (4), 689-710, 2019-12-15
The Association for Natural Language Processing
Details
-
- CRID
- 1390846609812016000
-
- NII Article ID
- 130007808661
-
- NII Book ID
- AN10472659
-
- ISSN
- 2185-8314
- 1340-7619
-
- NDL BIB ID
- 030141368
-
- Text Lang
- ja
-
- Data Source
-
- JaLC
- NDL
- Crossref
- CiNii Articles
-
- Abstract License Flag
- Disallowed