Data-Free Network Pruning for Model Compression
Description
Convolutional neural networks (CNNs) are often over-parameterized and cannot be deployed on resource-limited artificial intelligence (AI) devices. Several methods have been proposed to compress CNN models, but they are data-driven and often fail when training data are unavailable. To solve this problem, in this paper we propose a data-free model compression and acceleration method based on generative adversarial networks and network pruning (named DFNP), which can train a compact neural network given only a pre-trained neural network. DFNP consists of a source network, a generator, and a target network. First, the generator generates pseudo data under the supervision of the source network. The target network is then obtained by pruning the source network and is trained on the generated data, while the source network transfers knowledge to the target network so that it achieves performance similar to that of the source network. When VGGNet-19 is selected as the source network, the target network trained by DFNP contains only 25% of the parameters and 65% of the computations of the source network, yet retains 99.4% of its accuracy on the CIFAR-10 dataset without using any real data.
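The training pipeline the abstract describes (a generator synthesizing pseudo data under the source network's supervision, and a pruned target network distilled from the source on that data) can be sketched roughly as below. This is a minimal illustration assuming PyTorch; the generator architecture, the one-hot generator loss, and names such as `train_step` and the temperature `T` are assumptions for the sketch, not the authors' published implementation.

```python
# Minimal sketch of a DFNP-style data-free distillation loop (assumed PyTorch;
# all architecture and hyperparameter choices here are illustrative, not the
# authors' released code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class Generator(nn.Module):
    """Maps random noise to pseudo images (hypothetical architecture)."""
    def __init__(self, z_dim=100, img_ch=3, img_size=32):
        super().__init__()
        self.init_size = img_size // 4
        self.fc = nn.Linear(z_dim, 128 * self.init_size ** 2)
        self.net = nn.Sequential(
            nn.BatchNorm2d(128),
            nn.Upsample(scale_factor=2),
            nn.Conv2d(128, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.Upsample(scale_factor=2),
            nn.Conv2d(64, img_ch, 3, padding=1), nn.Tanh(),
        )

    def forward(self, z):
        x = self.fc(z).view(z.size(0), 128, self.init_size, self.init_size)
        return self.net(x)

def train_step(source, target, generator, opt_g, opt_t,
               batch=64, z_dim=100, T=4.0):
    """One alternating update: generator first, then the pruned target."""
    source.eval()  # the pre-trained source is frozen; it only supervises
    for p in source.parameters():
        p.requires_grad_(False)

    # Generator update: push pseudo data toward inputs the source network
    # classifies confidently (a one-hot loss, one common choice in
    # data-free distillation).
    z = torch.randn(batch, z_dim)
    fake = generator(z)
    s_out = source(fake)
    g_loss = F.cross_entropy(s_out, s_out.argmax(dim=1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

    # Target update: knowledge distillation from the source's softened
    # outputs on freshly generated pseudo data (no real data involved).
    with torch.no_grad():
        fake = generator(torch.randn(batch, z_dim))
        soft_t = F.softmax(source(fake) / T, dim=1)
    kd_loss = F.kl_div(F.log_softmax(target(fake) / T, dim=1),
                       soft_t, reduction="batchmean") * T * T
    opt_t.zero_grad(); kd_loss.backward(); opt_t.step()
    return g_loss.item(), kd_loss.item()
```

The pruning step that produces the target is omitted here, since the abstract does not specify the pruning criterion; in practice one would remove channels from the pre-trained source to form the compact target, then alternate the two updates above until the target recovers the source's accuracy.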
Journal
2021 IEEE International Symposium on Circuits and Systems (ISCAS), 1-5, 2021-05-01
IEEE