Modulating Regularization Frequency for Efficient Compression-Aware Model Training

Bibliographic Details
Title: Modulating Regularization Frequency for Efficient Compression-Aware Model Training
Authors: Lee, Dongsoo; Kwon, Se Jung; Kim, Byeongwook; Yun, Jeongin; Park, Baeseong; Jeon, Yongkweon
Publication Year: 2021
Collection: Computer Science
Subject Terms: Computer Science - Machine Learning, Computer Science - Artificial Intelligence
More Details: While model compression is increasingly important because of large neural network sizes, compression-aware training is challenging as it needs sophisticated model modifications and longer training time. In this paper, we introduce regularization frequency (i.e., how often compression is performed during training) as a new regularization technique for a practical and efficient compression-aware training method. For various regularization techniques, such as weight decay and dropout, optimizing the regularization strength is crucial to improving generalization in Deep Neural Networks (DNNs). While model compression also demands the right amount of regularization, the regularization strength incurred by model compression has so far been controlled only by the compression ratio. Through various experiments, we show that regularization frequency critically affects the regularization strength of model compression. By combining regularization frequency and compression ratio, the amount of weight updates induced by model compression per mini-batch can be optimized to achieve the best model accuracy. Modulating regularization frequency is implemented by performing compression only occasionally, whereas conventional compression-aware training compresses the model at every mini-batch.
Comment: arXiv admin note: text overlap with arXiv:1905.10145
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2105.01875
Accession Number: edsarx.2105.01875
Database: arXiv
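
Illustrative note: the abstract describes modulating regularization frequency by compressing the model only occasionally during training rather than at every mini-batch. The following is a minimal sketch of that idea, assuming PyTorch and magnitude pruning as the compression operator; the function magnitude_prune_, the toy model, and all hyperparameter values are illustrative assumptions, not taken from the paper.

import torch
import torch.nn as nn

def magnitude_prune_(model, compression_ratio):
    """Hypothetical compression step: zero the smallest-magnitude
    weights of each weight matrix in place."""
    with torch.no_grad():
        for param in model.parameters():
            if param.dim() < 2:
                continue  # skip biases
            k = int(param.numel() * compression_ratio)
            if k == 0:
                continue
            # k-th smallest absolute value serves as the pruning threshold
            threshold = param.abs().flatten().kthvalue(k).values
            param.mul_((param.abs() > threshold).to(param.dtype))

# Toy regression model and synthetic data, for illustration only.
model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 1))
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

regularization_frequency = 10  # compress every 10 mini-batches, not every one
compression_ratio = 0.5       # fraction of weights zeroed at each compression step

for step in range(1000):
    x = torch.randn(16, 32)
    y = torch.randn(16, 1)
    loss = loss_fn(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # Occasional compression: between compression events the weights
    # train freely, so the frequency (together with the compression
    # ratio) controls how strongly compression perturbs training.
    if (step + 1) % regularization_frequency == 0:
        magnitude_prune_(model, compression_ratio)

Setting regularization_frequency = 1 recovers conventional per-mini-batch compression-aware training; larger values weaken the regularization effect independently of the compression ratio, which is the knob the abstract proposes to tune.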