Bibliographic Information
- Alternative title: Analyzing Transformers via Value Matrices
Description
<p>We propose a new method for analyzing Transformer language models. In a Transformer self-attention module, attention weights are calculated from the query and key vectors, and output vectors are then obtained as the weighted sum of value vectors. While existing analyses of Transformers have focused on attention weights, this work focuses on the value and output matrices. We obtain joint matrices by multiplying the two matrices together, and show that the traces of the joint matrices are correlated with word co-occurrences.</p>
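The construction described in the abstract can be sketched in a few lines. This is a minimal illustration with toy sizes and assumed names (`W_V`, `W_O`, `d_model`, `d_head` are not taken from the paper): a head's value projection and output projection are composed into a single joint matrix, whose trace is the quantity the paper correlates with word co-occurrence.

```python
import numpy as np

# Toy dimensions (assumptions, not from the paper):
# d_model = model width, d_head = per-head width.
d_model, d_head = 8, 2
rng = np.random.default_rng(0)

W_V = rng.standard_normal((d_model, d_head))  # hidden state -> value vector
W_O = rng.standard_normal((d_head, d_model))  # head output -> hidden state

# Composing the two projections yields one d_model x d_model linear map
# applied to each attended token representation (attention weights aside).
W_VO = W_V @ W_O

# The paper studies the trace of such joint matrices; here the weights are
# random, so the value is purely illustrative.
trace = float(np.trace(W_VO))
print(W_VO.shape, trace)
```

In a trained model, `W_V` and `W_O` would be read out of each attention head's parameters rather than sampled at random.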
Journal
- 人工知能学会論文誌 38 (2), n/a-, 2023-03-01
- 一般社団法人 人工知能学会
Details
- CRID: 1390013795251121280
- ISSN: 13468030, 13460714
- Language code: ja
- Data sources: JaLC, Crossref, KAKEN, OpenAIRE
- Abstract license flag: Not available