Learning to Balance Local Losses via Meta-Learning
- Authors
- Yoa, Seungdong; Jeon, Minkyu; Oh, Youngjin; Kim, Hyunwoo J.
- Issue Date
- 2021
- Publisher
- Institute of Electrical and Electronics Engineers (IEEE)
- Keywords
- Deep learning; Licenses; Loss measurement; Neural networks; Standards; Task analysis; Training; image classification; machine learning; meta-learning
- Citation
- IEEE ACCESS, v.9, pp.130834 - 130844
- Indexed
- SCIE; SCOPUS
- Journal Title
- IEEE ACCESS
- Volume
- 9
- Start Page
- 130834
- End Page
- 130844
- URI
- https://scholar.korea.ac.kr/handle/2021.sw.korea/138666
- DOI
- 10.1109/ACCESS.2021.3113934
- ISSN
- 2169-3536
- Abstract
- The standard training of deep neural networks relies on a global, fixed loss function. Dynamic loss functions have recently been proposed for more effective training; however, a dynamic global loss function still lacks the flexibility to train the layers of a complex deep neural network differently. In this paper, we propose a general framework that learns to adaptively train each layer of a deep neural network via meta-learning. Our framework leverages local error signals from the layers and identifies, at every iteration, which layers need to be trained more. The proposed method further improves the local loss functions with minibatch-wise dropout and a cross-validation loop to alleviate meta-overfitting. Experiments show that our method achieves competitive performance against state-of-the-art methods on popular image-classification benchmarks: CIFAR-10 and CIFAR-100. Surprisingly, our method enables training deep neural networks without skip-connections by using dynamically weighted local loss functions.
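The core mechanism the abstract describes — softly weighting per-layer local losses and adjusting those weights with a meta step on a held-out minibatch — can be illustrated with a minimal, self-contained sketch. This is not the authors' implementation: the toy two-layer model, the finite-difference gradients, and all function names (`local_losses`, `weighted_update`) are illustrative assumptions standing in for backpropagation through real networks.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_losses(params, x, y):
    # Toy two-"layer" model: each layer gets its own local squared-error
    # loss against the target (purely illustrative).
    h = params[0] * x        # layer 1 output
    out = params[1] * h      # layer 2 output
    return np.array([np.mean((h - y) ** 2), np.mean((out - y) ** 2)])

def weighted_update(params, logits, x, y, lr=0.01, eps=1e-4):
    # One SGD step on the *weighted* sum of local losses; the per-layer
    # weights come from a softmax over learnable logits.
    w = np.exp(logits) / np.sum(np.exp(logits))
    base = w @ local_losses(params, x, y)
    grads = np.zeros_like(params)
    for i in range(len(params)):
        p = params.copy()
        p[i] += eps
        grads[i] = (w @ local_losses(p, x, y) - base) / eps  # finite difference
    return params - lr * grads

# Training minibatch plus a held-out minibatch for the meta step
# (a rough analogue of the abstract's cross-validation loop).
x_tr, x_val = rng.normal(size=32), rng.normal(size=32)
y_tr, y_val = 2.0 * x_tr, 2.0 * x_val

params = np.array([0.5, 1.0])
logits = np.zeros(2)  # per-layer loss-weight logits

for step in range(200):
    # Meta step: perturb each logit, check which weighting lowers the
    # validation loss after a hypothetical parameter update.
    base_val = np.sum(local_losses(weighted_update(params, logits, x_tr, y_tr), x_val, y_val))
    meta_grads = np.zeros_like(logits)
    for i in range(len(logits)):
        l = logits.copy()
        l[i] += 1e-3
        val = np.sum(local_losses(weighted_update(params, l, x_tr, y_tr), x_val, y_val))
        meta_grads[i] = (val - base_val) / 1e-3
    logits -= 0.1 * meta_grads
    # Inner step: update the model with the current loss weighting.
    params = weighted_update(params, logits, x_tr, y_tr)

print(local_losses(params, x_val, y_val).sum())
```

In the paper's actual setting the inner update would be a backpropagated step through each layer's local loss, and the meta-gradient would be computed analytically rather than by finite differences; the sketch only shows the two-loop structure (inner weighted update, outer validation-driven reweighting).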
- Appears in Collections
- Graduate School > Department of Computer Science and Engineering > 1. Journal Articles