Refined Consistency for Semi-Supervised Learning with Knowledge Distillation

Bibliographic Information

Other Title
  • Semi-Supervised Learning with Knowledge Distillation via Refined Consistency (Japanese title)

Description

<p>Semi-supervised learning is a method that trains a model on both labeled and unlabeled data. Dual Student (DS), which transfers knowledge between two networks, and Multiple Student (MS), which expands the number of DS networks to four or more, have been proposed as semi-supervised learning methods. MS achieves higher accuracy than DS, but training MS is inefficient because knowledge is not transferred between all networks at once. In this paper, we propose refined consistency, which transfers knowledge between all networks simultaneously, thereby improving accuracy through a more efficient knowledge transfer scheme. In experiments on the CIFAR-100 dataset, we show that the proposed method achieves higher accuracy than MS.</p>
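The all-pairs knowledge transfer described in the abstract can be sketched as a consistency term computed over every pair of student networks in a single pass. The following is a minimal NumPy sketch of that idea only; the function name and the squared-error form are illustrative assumptions, not the paper's actual loss, which also involves stability conditions between students.

```python
import numpy as np

def pairwise_consistency_loss(predictions):
    """Average squared difference between the outputs of every pair of
    student networks, computed in one pass over all pairs.

    predictions: list of arrays of shape (batch, classes), one per
    student network. (Hypothetical helper, not the paper's exact loss.)
    """
    n = len(predictions)
    total, pairs = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            # Consistency between student i and student j.
            total += np.mean((predictions[i] - predictions[j]) ** 2)
            pairs += 1
    return total / pairs
```

With N students this touches all N(N-1)/2 pairs in one loss evaluation, whereas a DS-style scheme would update only one pair at a time.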

Journal

Details

  • CRID
    1390006895527458688
  • NII Article ID
    130008051943
  • DOI
    10.11517/pjsai.jsai2021.0_4g2gs2k03
  • ISSN
    2758-7347
  • Text Lang
    ja
  • Data Source
    • JaLC
    • CiNii Articles
  • Abstract License Flag
    Disallowed
