Learning Augmentation for GNNs With Consistency Regularization
- Authors
- Park, Hyeonjin; Lee, Seunghun; Hwang, Dasol; Jeong, Jisu; Kim, Kyung-Min; Ha, Jung-Woo; Kim, Hyunwoo J.
- Issue Date
- 2021
- Publisher
- Institute of Electrical and Electronics Engineers (IEEE)
- Keywords
- Task analysis; Graph neural networks; Topology; Training; Data models; Licenses; Training data; augmentation; semi-supervised learning; meta-learning
- Citation
- IEEE ACCESS, v.9, pp. 127961-127972
- Indexed
- SCIE, SCOPUS
- Journal Title
- IEEE ACCESS
- Volume
- 9
- Start Page
- 127961
- End Page
- 127972
- URI
- https://scholar.korea.ac.kr/handle/2021.sw.korea/138680
- DOI
- 10.1109/ACCESS.2021.3111908
- ISSN
- 2169-3536
- Abstract
- Graph neural networks (GNNs) have demonstrated superior performance on various graph tasks. However, existing GNNs often suffer from weak generalization due to sparsely labeled datasets. Here we propose a novel framework that learns to augment the input features using topological information and automatically controls the strength of augmentation. Our framework learns an augmentor that minimizes the GNN's loss on unseen labeled data while maximizing the consistency of the GNN's predictions on unlabeled data. This can be formulated as a meta-learning problem, and our framework alternately optimizes the augmentor and the GNN for a target task. Our extensive experiments demonstrate that the proposed framework is applicable to any GNN and significantly improves performance on node classification. In particular, our method provides a 5.78% average improvement with graph convolutional networks (GCN) across five benchmark datasets.
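The alternating scheme in the abstract — fit the model on augmented labeled data, then tune the augmentor against held-out loss plus prediction consistency on unlabeled data — can be sketched in miniature. This is NOT the paper's implementation: the model is a toy logistic regressor instead of a GNN, the "augmentor" is a single learned noise scale, the outer meta-update is replaced by a crude grid search, and all names (`augment`, `consistency_loss`, `sigma`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def augment(X, sigma, rng):
    # Augmentor: perturb input features; strength sigma is the learned knob.
    return X + sigma * rng.standard_normal(X.shape)

def train_model(X, y, sigma, rng, steps=200, lr=0.5):
    # Inner loop: fit logistic regression on augmented labeled data.
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        Xa = augment(X, sigma, rng)
        p = sigmoid(Xa @ w)
        w -= lr * Xa.T @ (p - y) / len(y)
    return w

def val_loss(w, X, y):
    # Cross-entropy on unseen labeled data.
    p = np.clip(sigmoid(X @ w), 1e-7, 1 - 1e-7)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def consistency_loss(w, X_unlab, sigma, rng, k=8):
    # Variance of predictions across k random augmentations of the
    # unlabeled data: lower variance = more consistent predictions.
    preds = np.stack([sigmoid(augment(X_unlab, sigma, rng) @ w) for _ in range(k)])
    return preds.var(axis=0).mean()

# Toy data: two Gaussian blobs; split into labeled / held-out / unlabeled.
X = np.vstack([rng.normal(-1, 1, (60, 2)), rng.normal(1, 1, (60, 2))])
y = np.array([0] * 60 + [1] * 60)
idx = rng.permutation(120)
lab, val, unlab = idx[:40], idx[40:80], idx[80:]

# Outer loop: choose the augmentation strength that minimizes held-out
# loss plus inconsistency (a stand-in for the paper's meta-gradient step).
best = None
for sigma in [0.0, 0.2, 0.5, 1.0, 2.0]:
    w = train_model(X[lab], y[lab], sigma, rng)
    score = val_loss(w, X[val], y[val]) + consistency_loss(w, X[unlab], sigma, rng)
    if best is None or score < best[0]:
        best = (score, sigma, w)

score, sigma, w = best
acc = np.mean((sigmoid(X[val] @ w) > 0.5) == y[val])
print(f"chosen sigma={sigma}, held-out accuracy={acc:.2f}")
```

The key structural point the sketch preserves is the bilevel split: the inner loop sees only augmented labeled data, while the augmentation strength is judged on data the inner loop never fit — held-out labels for accuracy and unlabeled points for consistency.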
- Appears in Collections
- Graduate School > Department of Computer Science and Engineering > 1. Journal Articles