Detailed Information

A survey on parallel training algorithms for deep neural networks

Authors
Yook, Dongsuk; Lee, Hyowon; Yoo, In-Chul
Issue Date
2020
Publisher
ACOUSTICAL SOC KOREA
Keywords
Deep Neural Network (DNN); Deep learning; Stochastic Gradient Descent (SGD); Parallel processing
Citation
JOURNAL OF THE ACOUSTICAL SOCIETY OF KOREA, v.39, no.6, pp.505 - 514
Indexed
SCOPUS
KCI
Journal Title
JOURNAL OF THE ACOUSTICAL SOCIETY OF KOREA
Volume
39
Number
6
Start Page
505
End Page
514
URI
https://scholar.korea.ac.kr/handle/2021.sw.korea/59067
DOI
10.7776/ASK.2020.39.6.505
ISSN
1225-4428
Abstract
Since a large amount of training data is typically needed to train Deep Neural Networks (DNNs), parallel training approaches are required. The Stochastic Gradient Descent (SGD) algorithm is one of the most widely used methods for training DNNs. However, because SGD is an inherently sequential process, some form of approximation scheme is needed to parallelize it. In this paper, we review various efforts to parallelize the SGD algorithm, and analyze the computational overhead, the communication overhead, and the effects of the approximations.
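
To make the parallelization idea concrete, the sketch below illustrates synchronous data-parallel SGD with mini-batch gradient averaging, one common approximation to sequential SGD in which each worker computes a gradient on its own data shard and the averaged gradient is applied as a single update. The linear model, squared loss, and names used here (gradient, parallel_sgd_step, num_workers) are illustrative assumptions, not details taken from the paper.

# Minimal sketch of synchronous data-parallel SGD with gradient averaging.
# Each "worker" computes a gradient on its own shard of the mini-batch; the
# gradients are averaged and applied as one update, approximating sequential SGD.
# The model, loss, and parameter names are illustrative only.

import numpy as np

def gradient(w, X, y):
    # Gradient of mean squared error for a linear model y_hat = X @ w.
    return 2.0 * X.T @ (X @ w - y) / len(y)

def parallel_sgd_step(w, X, y, lr=0.01, num_workers=4):
    # Split the mini-batch across workers; in a real system each shard would be
    # processed on a separate device and the gradients exchanged (e.g. via
    # all-reduce) before the averaged update is applied.
    shards = zip(np.array_split(X, num_workers), np.array_split(y, num_workers))
    grads = [gradient(w, Xs, ys) for Xs, ys in shards]  # computed in parallel
    return w - lr * np.mean(grads, axis=0)              # single averaged update

# Toy usage: fit w on synthetic data y = X @ [1, 2] + noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(256, 2))
y = X @ np.array([1.0, 2.0]) + 0.1 * rng.normal(size=256)
w = np.zeros(2)
for _ in range(200):
    w = parallel_sgd_step(w, X, y)
print(w)  # approaches [1, 2]

Because the workers synchronize every step, this scheme matches plain mini-batch SGD exactly when the shards come from the same mini-batch; asynchronous variants relax this synchronization and introduce the approximation errors and communication trade-offs surveyed in the paper.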
Files in This Item
There are no files associated with this item.
Appears in Collections
Graduate School > Department of Computer Science and Engineering > 1. Journal Articles
