Detailed Information


Graph Transformer Networks: Learning meta-path graphs to improve GNNs (open access)

Authors
Yun, Seongjun; Jeong, Minbyul; Yoo, Sungdong; Lee, Seunghun; Yi, Sean S.; Kim, Raehyun; Kang, Jaewoo; Kim, Hyunwoo J.
Issue Date
September 2022
Publisher
PERGAMON-ELSEVIER SCIENCE LTD
Keywords
Graph Neural Networks; Heterogeneous graphs; Machine learning on graphs; Network analysis
Citation
NEURAL NETWORKS, v.153, pp.104 - 119
Indexed
SCIE
SCOPUS
Journal Title
NEURAL NETWORKS
Volume
153
Start Page
104
End Page
119
URI
https://scholar.korea.ac.kr/handle/2021.sw.korea/142726
DOI
10.1016/j.neunet.2022.05.026
ISSN
0893-6080
Abstract
Graph Neural Networks (GNNs) have been widely applied to various fields due to their powerful representations of graph-structured data. Despite the success of GNNs, most existing GNNs are designed to learn node representations on fixed, homogeneous graphs. These limitations become especially problematic when learning representations on a misspecified graph or a heterogeneous graph that consists of various types of nodes and edges. To address these limitations, we propose Graph Transformer Networks (GTNs), which are capable of generating new graph structures that preclude noisy connections and include useful connections (e.g., meta-paths) for tasks, while learning effective node representations on the new graphs in an end-to-end fashion. We further propose an enhanced version of GTNs, Fast Graph Transformer Networks (FastGTNs), which improves the scalability of graph transformations. Compared to GTNs, FastGTNs are up to 230x and 150x faster in inference and training, and use up to 100x and 148x less memory, while allowing the identical graph transformations as GTNs. In addition, we extend graph transformations to the semantic proximity of nodes, allowing non-local operations beyond meta-paths. Extensive experiments on both homogeneous graphs and heterogeneous graphs show that GTNs and FastGTNs with non-local operations achieve state-of-the-art performance for node classification tasks. The code is available: https://github.com/seongjunyun/Graph_Transformer_Networks. (C) 2022 The Authors. Published by Elsevier Ltd.
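The core graph-transformation step the abstract describes can be sketched as follows. This is a minimal, illustrative NumPy sketch, not the authors' implementation: a GTN layer softly selects among the edge-type adjacency matrices of a heterogeneous graph (the paper uses a 1x1 convolution with softmax weights) and composes two selections by matrix multiplication to obtain a length-2 meta-path adjacency. The function and parameter names (`gtn_layer`, `w1`, `w2`) are hypothetical.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D weight vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def gtn_layer(adjacencies, w1, w2):
    """Simplified Graph Transformer layer (illustrative only).

    adjacencies: list of (N, N) adjacency matrices, one per edge type.
    w1, w2: learnable selection weights, one scalar per edge type.
    Returns an (N, N) adjacency for length-2 meta-paths.
    """
    # Soft selection: a convex combination of the candidate edge-type
    # adjacencies, analogous to the paper's softmax-weighted 1x1 conv.
    a1 = sum(a * s for a, s in zip(adjacencies, softmax(w1)))
    a2 = sum(a * s for a, s in zip(adjacencies, softmax(w2)))
    # Matrix multiplication composes the two selected edge types into
    # a meta-path adjacency: (i, j) counts weighted length-2 paths.
    return a1 @ a2
```

Stacking such layers yields longer meta-paths, and a standard GNN can then be run on the learned adjacency; FastGTNs reach the same transformations without materializing the intermediate meta-path adjacency matrices.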
Files in This Item
There are no files associated with this item.
Appears in
Collections
Graduate School > Department of Computer Science and Engineering > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.
