Refined Consistency for Semi-Supervised Learning with Knowledge Distillation
-
- MURAMOTO Yoshitaka
- Chubu University
-
- OKAMOTO Naoki
- Chubu University
-
- HIRAKAWA Tubasa
- Chubu University
-
- YAMASHITA Takayoshi
- Chubu University
-
- FUJIYOSHI Hironobu
- Chubu University
Bibliographic Information
- Other Title
-
- Refined Consistencyによる知識蒸留を用いた半教師あり学習
Description
<p>Semi-supervised learning is a method that trains a model on both labeled and unlabeled data. Dual Student (DS), which transfers knowledge between two networks, and Multiple Student (MS), which expands DS to four or more networks, have been proposed as semi-supervised learning methods. MS achieves higher accuracy than DS, but its training is inefficient because knowledge is not transferred between all networks at once. In this paper, we propose refined consistency, which transfers knowledge between all networks at once, improving accuracy through more efficient knowledge transfer. In experiments on the CIFAR-100 dataset, we show that the proposed method achieves higher accuracy than MS.</p>
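The record contains no code, but the core idea in the abstract can be illustrated. The following is a minimal sketch, assuming the refined consistency is an agreement term computed over the outputs of all student networks in a single step (in contrast to DS, which only pairs two networks); the function name, the use of mean pairwise MSE, and the toy data are hypothetical illustrations, not the authors' actual formulation.

```python
import numpy as np

def refined_consistency_loss(preds):
    """Mean pairwise MSE between the predictions of all student
    networks on the same unlabeled batch, so that knowledge is
    exchanged among every network at once (hypothetical sketch)."""
    n = len(preds)
    total, pairs = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            total += np.mean((preds[i] - preds[j]) ** 2)
            pairs += 1
    return total / pairs

# Toy example: four students, a batch of 2 samples, 3 classes each.
rng = np.random.default_rng(0)
preds = [rng.random((2, 3)) for _ in range(4)]
print(refined_consistency_loss(preds))
```

Under this reading, minimizing the term pulls all students toward mutual agreement on unlabeled data in one update, rather than propagating knowledge pair by pair as in DS-style training.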
Journal
-
- Proceedings of the Annual Conference of JSAI
-
- Proceedings of the Annual Conference of JSAI, JSAI2021 (0), 4G2GS2k03-4G2GS2k03, 2021
- The Japanese Society for Artificial Intelligence
Details
-
- CRID
- 1390006895527458688
-
- NII Article ID
- 130008051943
-
- ISSN
- 27587347
-
- Text Lang
- ja
-
- Data Source
-
- JaLC
- CiNii Articles
-
- Abstract License Flag
- Disallowed