Bridging the gap between complex scientific research and the curious minds eager to explore it.

Computer Science, Computer Vision and Pattern Recognition

Compressing GANs with Discriminator-Free Distillation

In this paper, the authors propose a novel approach to compressing Generative Adversarial Networks (GANs) while preserving their performance. They introduce a pruning method that reduces the size and computational complexity of the generator and discriminator, making the models more suitable for deployment on edge devices. The proposed method, called Pruned GANs (PGANs), consists of two stages:

  1. Pruning Stage: The authors employ a trained pruning agent to determine the optimal sub-structure of the generator and discriminator. This stage reduces the number of weights and computations required by the models.
  2. Fine-tuning Stage: The authors fine-tune the pruned models with the Adam optimizer (Kingma and Ba, 2015), using the original loss functions of Pix2Pix (Isola et al., 2017) and CycleGAN (Zhu et al., 2017). This stage recovers the models' performance while keeping the computational savings (see the sketch after this list).
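
To make the two-stage pipeline concrete, here is a minimal PyTorch sketch. The paper's trained pruning agent is replaced with simple L1-magnitude channel scoring (a common stand-in, not the authors' method), and the full Pix2Pix/CycleGAN objectives are reduced to an L1 reconstruction term; `generator`, `dataloader`, and the hyperparameters are hypothetical placeholders.

```python
import torch
import torch.nn as nn

def prune_conv_channels(conv: nn.Conv2d, keep_ratio: float) -> nn.Conv2d:
    """Stage 1 (stand-in): keep the output channels with the largest
    L1 weight magnitude. The paper instead uses a trained pruning agent
    to pick the sub-structure; magnitude scoring is a simple proxy."""
    scores = conv.weight.detach().abs().sum(dim=(1, 2, 3))  # one score per output channel
    n_keep = max(1, int(conv.out_channels * keep_ratio))
    keep = torch.topk(scores, n_keep).indices.sort().values
    pruned = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                       stride=conv.stride, padding=conv.padding,
                       bias=conv.bias is not None)
    pruned.weight.data = conv.weight.data[keep].clone()
    if conv.bias is not None:
        pruned.bias.data = conv.bias.data[keep].clone()
    # NOTE: shrinking this layer's output channels means the *next*
    # layer's input channels must be shrunk to match; a full pipeline
    # propagates the kept indices through the whole network.
    return pruned

def fine_tune(generator: nn.Module, dataloader, epochs: int = 10):
    """Stage 2: refine the pruned generator with Adam (Kingma and Ba, 2015).
    An L1 reconstruction loss stands in for the full Pix2Pix/CycleGAN
    objectives, which also include adversarial terms."""
    opt = torch.optim.Adam(generator.parameters(), lr=2e-4, betas=(0.5, 0.999))
    for _ in range(epochs):
        for x, y in dataloader:  # (input, target) image pairs
            opt.zero_grad()
            loss = nn.functional.l1_loss(generator(x), y)
            loss.backward()
            opt.step()
```

The learning rate and betas shown are the conventional Pix2Pix/CycleGAN training settings, not values reported for PGANs.
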
The authors demonstrate the effectiveness of PGANs on several benchmark datasets, achieving results comparable to or better than state-of-the-art methods while using significantly less computation. They also show that the approach extends to other types of neural networks, such as convolutional neural networks (CNNs), by applying the same pruning and fine-tuning techniques.
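
The claimed computational savings can be sanity-checked by comparing parameter counts before and after pruning (MAC counts need a profiler such as ptflops or fvcore). This toy snippet uses a single convolution as a stand-in for one generator block:

```python
import torch.nn as nn

def count_params(model: nn.Module) -> int:
    # Total trainable parameters in the module.
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

dense = nn.Conv2d(64, 128, 3, padding=1)   # toy stand-in for a generator block
pruned = nn.Conv2d(64, 64, 3, padding=1)   # the same block after 50% channel pruning
print(f"params: {count_params(dense)} -> {count_params(pruned)} "
      f"({count_params(dense) / count_params(pruned):.1f}x smaller)")
```
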
In summary, PGANs offer a promising solution for efficient inference on edge devices by reducing the computational complexity of GANs without compromising their performance. By employing a trained pruning agent and fine-tuning the pruned models, the authors are able to compress GANs while maintaining their ability to generate high-quality images. This work has significant implications for deploying AI models on resource-constrained devices, making them more accessible and practical for everyday use.