Hideki Tsunashima, Hirokatsu Kataoka, Junji Yamato, Qiu Chen and Shigeo Morishima

Adversarial Knowledge Distillation for a Compact Generator

25th International Conference on Pattern Recognition (ICPR2020)

https://www.micc.unifi.it/icpr2020/

In this paper, we propose memory-efficient Generative Adversarial Nets (GANs) in line with knowledge distillation. Most existing GANs suffer from a large number of model parameters and low processing speed. To tackle this problem, we propose Adversarial Knowledge Distillation for Generative models (AKDG) for highly efficient GANs in unconditional generation. Using AKDG, model size and processing time are substantially reduced. Through adversarial training with a distillation discriminator, a student generator successfully mimics a teacher generator with fewer layers, fewer parameters, and a higher processing speed. Moreover, our AKDG is network-architecture agnostic. A comparison of AKDG-applied models to vanilla models shows that the student achieves scores closer to the teacher generator and performs more efficiently than a baseline method with respect to Inception Score (IS) and Frechet Inception Distance (FID). In CIFAR-10 experiments, AKDG improves IS/FID by 1.17pt/55.19pt, and in LSUN bedroom experiments it improves FID by 71.1pt, in comparison to the conventional distillation method for GANs. Our project page is https://maguro27.github.io/AKDG/
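
Below is a minimal sketch of one adversarial-distillation training step as the abstract describes it (a distillation discriminator separates teacher outputs from student outputs, and the compact student generator learns to fool it). The PyTorch usage and the names `teacher_g`, `student_g`, `distill_d`, and the hinge losses are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn.functional as F

def akdg_step(teacher_g, student_g, distill_d, opt_g, opt_d,
              z_dim=128, batch=64, device="cuda"):
    """One assumed AKDG step: distill a teacher generator into a student
    generator via an adversarial game with a distillation discriminator."""
    z = torch.randn(batch, z_dim, device=device)
    with torch.no_grad():
        x_teacher = teacher_g(z)      # teacher samples act as the "real" data
    x_student = student_g(z)          # compact student tries to mimic the teacher

    # Update the distillation discriminator: separate teacher from student
    # outputs (hinge loss, a common GAN objective; assumption here).
    d_loss = (F.relu(1.0 - distill_d(x_teacher)).mean()
              + F.relu(1.0 + distill_d(x_student.detach())).mean())
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Update the student generator: fool the distillation discriminator.
    g_loss = -distill_d(student_g(z)).mean()
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()
```

In this sketch the teacher is frozen and only supplies target samples, so the student and the distillation discriminator are the only networks being trained, which matches the compact-generator goal stated above.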