Evaluation of Instruction Tuning on Finance-Specific Large Language Models
-
- YAMADA Masatsugu
- Japan Digital Design Inc.
-
- IMOTO Toshiya
- Japan Digital Design Inc.
Bibliographic Information
- Other Title
-
- 金融ドメイン特化のための大規模言語モデルのインストラクションチューニング評価
Description
<p>It is beginning to be reported that small language models specialized for specific domains exceed the performance of general-purpose large language models. However, open-source language models specialized for the financial domain are limited, and those with sufficient performance have not yet been adequately evaluated. Therefore, in this paper, we used benchmark sets containing various financial-domain tasks such as sentiment analysis, classification, and question answering, and evaluated how the performance of a small chat model changes under multiple instruction-tuning conditions. We trained 7B and 13B models for this task by fine-tuning with low-rank adaptation. We empirically found that each model tended to improve performance with both continual pre-training and supervised fine-tuning despite over-fitting, and that the generated results were affected by the instruction template.</p>
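The low-rank adaptation (LoRA) used for fine-tuning here can be illustrated with a minimal NumPy sketch: the frozen pretrained weight W is left untouched, and only a rank-r product B·A is trained. The dimensions, scaling factor, and initialization below are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

# Illustrative shapes (not the paper's 7B/13B settings): a single
# d_out x d_in linear layer with a rank-r adapter, r << min(d_out, d_in).
d_out, d_in, r = 1024, 1024, 8

rng = np.random.default_rng(0)
W = rng.standard_normal((d_out, d_in))      # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                    # trainable up-projection, zero-init
                                            # so training starts at the base model

def lora_forward(x: np.ndarray, alpha: float = 16.0) -> np.ndarray:
    """Forward pass: frozen weight plus scaled low-rank update B @ A."""
    return x @ W.T + (x @ A.T) @ B.T * (alpha / r)

# Only A and B are updated, so the trainable fraction is tiny.
full_params = W.size
lora_params = A.size + B.size
print(f"trainable fraction: {lora_params / full_params:.4%}")  # → 1.5625%
```

Because B starts at zero, the adapted model is initially identical to the base model; training then moves only the 2·r·d parameters of A and B, which is why LoRA makes instruction tuning of 7B-13B models feasible on modest hardware.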
Journal
-
- Proceedings of the Annual Conference of JSAI
-
Proceedings of the Annual Conference of JSAI JSAI2024 (0), 3Xin253-3Xin253, 2024
The Japanese Society for Artificial Intelligence
Details
-
- CRID
- 1390018971042399744
-
- ISSN
- 2758-7347
-
- Text Lang
- ja
-
- Data Source
-
- JaLC
-
- Abstract License Flag
- Disallowed