Machine Perception and Robotics Group
Chubu University

Deep Learning / International Conference

Refining Design Spaces in Knowledge Distillation for Deep Collaborative Learning

Author
Sachi Iwata, Soma Minami, Tsubasa Hirakawa, Takayoshi Yamashita, Hironobu Fujiyoshi
Publication
International Conference on Pattern Recognition, 2022

Download: PDF (English)

Knowledge distillation is one of the most widely used methods for improving the performance of a model. The knowledge transfer graph has been proposed for deep collaborative learning, enabling a rich diversity of bidirectional knowledge distillation. However, exploring a knowledge transfer graph is difficult because of the many potential combinations it can have, so it is unclear how accurate the resulting graphs will actually be. To address this issue, we propose a method for designing the search space step by step, and we analyze trends in the explored graphs to design high-accuracy graphs on the basis of the acquired results. Experiments on the CIFAR-100 dataset confirm that the accuracy of the best knowledge transfer graph in our search space is higher than that of the graph found by the asynchronous successive halving algorithm. We also demonstrate that the explored knowledge transfer graphs can be transferred to different datasets.
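As background for the bidirectional distillation that each edge of a knowledge transfer graph performs, the sketch below computes the standard soft-target distillation loss (temperature-scaled KL divergence between two models' output distributions) in both directions. This is a minimal NumPy illustration, not the paper's implementation; the function names, temperature value, and example logits are all assumptions for the sake of the example.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=4.0):
    """Soft-target distillation loss: KL(teacher || student) at temperature T,
    scaled by T^2 as is conventional (hypothetical helper for illustration)."""
    p = softmax(teacher_logits / T)  # softened teacher distribution
    q = softmax(student_logits / T)  # softened student distribution
    return float((T * T) * (p * (np.log(p) - np.log(q))).sum(axis=-1).mean())

# A bidirectional edge in a knowledge transfer graph: each model
# distills from the other's soft targets.
logits_a = np.array([[2.0, 1.0, 0.1]])
logits_b = np.array([[1.5, 1.2, 0.3]])
loss_a = kd_loss(logits_a, logits_b)  # would drive model A toward B
loss_b = kd_loss(logits_b, logits_a)  # would drive model B toward A
```

In the full method, which edges exist, their direction, and the loss used on each edge are the design choices that make up the graph search space.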
