Decoupled Training for Long-Tailed Classification With Stochastic Representations

Bibliographic Details
Title: Decoupled Training for Long-Tailed Classification With Stochastic Representations
Authors: Nam, Giung; Jang, Sunguk; Lee, Juho
Publication Year: 2023
Collection: Computer Science
Subject Terms: Computer Science - Machine Learning, Computer Science - Computer Vision and Pattern Recognition
More Details: Decoupling representation learning and classifier learning has been shown to be effective in classification with long-tailed data. There are two main ingredients in constructing a decoupled learning scheme: 1) how to train the feature extractor for representation learning so that it provides generalizable representations, and 2) how to re-train the classifier so that it constructs proper decision boundaries by handling the class imbalance in long-tailed data. In this work, we first apply Stochastic Weight Averaging (SWA), an optimization technique for improving the generalization of deep neural networks, to obtain better-generalizing feature extractors for long-tailed classification. We then propose a novel classifier re-training algorithm based on stochastic representations obtained from SWA-Gaussian, a Gaussian-perturbed variant of SWA, together with a self-distillation strategy that harnesses the diverse stochastic representations, based on their uncertainty estimates, to build more robust classifiers. Extensive experiments on the CIFAR10/100-LT, ImageNet-LT, and iNaturalist-2018 benchmarks show that our proposed method improves upon previous methods in terms of both prediction accuracy and uncertainty estimation.
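The two ingredients named in the abstract, SWA for the feature extractor and SWA-Gaussian (SWAG) samples for classifier re-training, can be made concrete with a short sketch. The following is a minimal, hedged illustration in PyTorch and is not the authors' implementation: it maintains a diagonal SWAG posterior (running mean and second moment of SGD iterates) and draws Gaussian-perturbed weights, so that each sample yields a different stochastic representation. All helper names here (`swag_collect`, `load_swag_sample`, `scale`) are illustrative assumptions.

```python
import torch

def swag_collect(swag_state, model, n):
    """Fold the current SGD iterate into the running SWA mean and
    second moment (the diagonal-SWAG sufficient statistics)."""
    for name, p in model.named_parameters():
        if name not in swag_state:  # first collected iterate initializes the statistics
            swag_state[name] = {"mean": p.detach().clone(),
                                "sq_mean": p.detach().clone() ** 2}
        else:
            s = swag_state[name]
            s["mean"] += (p.detach() - s["mean"]) / (n + 1)
            s["sq_mean"] += (p.detach() ** 2 - s["sq_mean"]) / (n + 1)
    return n + 1  # number of iterates averaged so far

@torch.no_grad()
def load_swag_sample(swag_state, model, scale=0.5):
    """Load mean + diagonal Gaussian noise into the model; each call
    yields a different feature extractor, hence stochastic representations."""
    for name, p in model.named_parameters():
        s = swag_state[name]
        var = (s["sq_mean"] - s["mean"] ** 2).clamp_min(1e-30)
        p.copy_(s["mean"] + scale * var.sqrt() * torch.randn_like(var))
```

In a decoupled pipeline of this kind, one would call `swag_collect` at the end of each of the last few epochs of representation learning, then repeatedly call `load_swag_sample` during classifier re-training so that the backbone produces diverse features for each image; the paper additionally applies a self-distillation strategy over these sampled representations, which this sketch does not cover.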
Comment: ICLR 2023
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2304.09426
Accession Number: edsarx.2304.09426
Database: arXiv