Self-augmentation: Generalizing deep networks to unseen classes for few-shot learning
- Authors
- Seo, J.-W.; Jung, H.-G.; Lee, S.-W.
- Issue Date
- Jun-2021
- Publisher
- Elsevier Ltd
- Keywords
- Classification; Few-shot learning; Generalization; Knowledge distillation
- Citation
- Neural Networks, v.138, pp. 140-149
- Indexed
- SCIE; SCOPUS
- Journal Title
- Neural Networks
- Volume
- 138
- Start Page
- 140
- End Page
- 149
- URI
- https://scholar.korea.ac.kr/handle/2021.sw.korea/128833
- DOI
- 10.1016/j.neunet.2021.02.007
- ISSN
- 0893-6080
- Abstract
- Few-shot learning aims to classify unseen classes from only a few training examples. While recent work has shown that standard mini-batch training with carefully designed training strategies can improve generalization to unseen classes, well-known problems of deep networks, such as memorizing training statistics, have been less explored in few-shot learning. To tackle this issue, we propose self-augmentation, which consolidates self-mix and self-distillation. Specifically, we propose a regional dropout technique called self-mix, in which a patch of an image is replaced with values drawn from another region of the same image. We show that this dropout effect improves the generalization ability of deep networks by preventing them from memorizing dataset-specific structures. We then employ a backbone network with auxiliary branches, each having its own classifier, to enforce knowledge sharing; this sharing forces each branch to learn diverse optima during training. Additionally, we present a local representation learner that further exploits the few training examples of unseen classes by generating fake queries and novel weights. Experimental results show that the proposed method outperforms state-of-the-art methods on prevalent few-shot benchmarks and improves generalization ability. © 2021 Elsevier Ltd
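The two core components described in the abstract, self-mix and branch-wise knowledge sharing, can be sketched in a few lines of PyTorch. The following is a minimal illustration, not the authors' released code: the function name `self_mix`, the `patch_ratio` hyperparameter, and the uniform sampling of source and destination patches are assumptions made for this sketch; the paper's actual patch sizing and selection rules may differ.

```python
import torch

def self_mix(images: torch.Tensor, patch_ratio: float = 0.5) -> torch.Tensor:
    """Regional dropout in the spirit of self-mix (assumed form): a random
    patch of each image is overwritten with values taken from another
    randomly chosen region of the SAME image."""
    n, c, h, w = images.shape
    ph, pw = int(h * patch_ratio), int(w * patch_ratio)  # patch size (assumed ratio)
    out = images.clone()
    for i in range(n):
        # Destination patch: the region to be "dropped out".
        y1 = torch.randint(0, h - ph + 1, (1,)).item()
        x1 = torch.randint(0, w - pw + 1, (1,)).item()
        # Source patch: values substituted in, drawn from the same image.
        y2 = torch.randint(0, h - ph + 1, (1,)).item()
        x2 = torch.randint(0, w - pw + 1, (1,)).item()
        out[i, :, y1:y1 + ph, x1:x1 + pw] = images[i, :, y2:y2 + ph, x2:x2 + pw]
    return out
```

The knowledge sharing between auxiliary classifier branches is a form of self-distillation; one common realization, shown below under that assumption, is a symmetric soft-label (KL) loss between branch logits, with the temperature `T` as an assumed hyperparameter.

```python
import torch.nn.functional as F

def mutual_distillation_loss(logits_a: torch.Tensor,
                             logits_b: torch.Tensor,
                             T: float = 4.0) -> torch.Tensor:
    """Symmetric KL divergence between temperature-softened branch outputs,
    so each branch is pushed toward the other's predictive distribution."""
    loss_ab = F.kl_div(F.log_softmax(logits_a / T, dim=1),
                       F.softmax(logits_b / T, dim=1),
                       reduction="batchmean")
    loss_ba = F.kl_div(F.log_softmax(logits_b / T, dim=1),
                       F.softmax(logits_a / T, dim=1),
                       reduction="batchmean")
    # The T*T factor keeps gradient magnitudes comparable across temperatures.
    return (loss_ab + loss_ba) * (T * T)
```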
- Appears in Collections
- Graduate School > Department of Artificial Intelligence > 1. Journal Articles