Speculative Backpropagation for CNN Parallel Training

Full metadata record
dc.contributor.author: Park, Sangwoo
dc.contributor.author: Suh, Taeweon
dc.date.accessioned: 2021-08-31T16:16:20Z
dc.date.available: 2021-08-31T16:16:20Z
dc.date.created: 2021-06-18
dc.date.issued: 2020
dc.identifier.issn: 2169-3536
dc.identifier.uri: https://scholar.korea.ac.kr/handle/2021.sw.korea/59077
dc.description.abstract: Parallel learning in neural networks can greatly shorten training time, but prior efforts were mostly limited to distributing inputs across multiple computing engines, because the gradient descent algorithm used in neural network training is inherently sequential. This paper proposes a novel CNN parallel training method for image recognition. It overcomes the sequential nature of gradient descent and enables parallel training through speculative backpropagation. We found that the Softmax and ReLU outcomes in the forward propagation are likely to be very similar for samples with the same label. This characteristic makes it possible to perform the forward and backward propagation simultaneously. We implemented the proposed parallel model with CNNs in both software and hardware and evaluated its performance. The parallel training reduces the training time by 34% on CIFAR-100 without loss of prediction accuracy compared to sequential training; in many cases, it even improves the accuracy.
dc.language: English
dc.language.iso: en
dc.publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
dc.title: Speculative Backpropagation for CNN Parallel Training
dc.type: Article
dc.contributor.affiliatedAuthor: Suh, Taeweon
dc.identifier.doi: 10.1109/ACCESS.2020.3040849
dc.identifier.scopusid: 2-s2.0-85097817217
dc.identifier.wosid: 000597968600001
dc.identifier.bibliographicCitation: IEEE ACCESS, v.8, pp.215365 - 215374
dc.relation.isPartOf: IEEE ACCESS
dc.citation.title: IEEE ACCESS
dc.citation.volume: 8
dc.citation.startPage: 215365
dc.citation.endPage: 215374
dc.type.rims: ART
dc.type.docType: Article
dc.description.journalClass: 1
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.relation.journalResearchArea: Computer Science
dc.relation.journalResearchArea: Engineering
dc.relation.journalResearchArea: Telecommunications
dc.relation.journalWebOfScienceCategory: Computer Science, Information Systems
dc.relation.journalWebOfScienceCategory: Engineering, Electrical & Electronic
dc.relation.journalWebOfScienceCategory: Telecommunications
dc.subject.keywordAuthor: Training
dc.subject.keywordAuthor: Backpropagation
dc.subject.keywordAuthor: Neurons
dc.subject.keywordAuthor: Computational modeling
dc.subject.keywordAuthor: Parallel processing
dc.subject.keywordAuthor: Hardware
dc.subject.keywordAuthor: Biological neural networks
dc.subject.keywordAuthor: Deep learning
dc.subject.keywordAuthor: parallel training
dc.subject.keywordAuthor: speculative backpropagation
dc.subject.keywordAuthor: training accelerator
dc.subject.keywordAuthor: FPGA
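
The abstract describes the key observation behind speculative backpropagation: Softmax and ReLU outcomes in the forward pass tend to be very similar across samples that share a label, so a backward pass can start from activations cached for an earlier same-label sample while the current sample's forward pass runs in parallel. The sketch below is only a minimal illustration of that reuse idea under assumptions of my own (a tiny one-hidden-layer ReLU/Softmax network, a per-label activation cache named `cache`, and a sequential loop that mimics rather than implements the concurrency); it is not the implementation from the paper.

```python
# Minimal, hypothetical sketch of label-based speculative backpropagation.
# A tiny one-hidden-layer ReLU/Softmax network trained on single samples;
# all names and the exact reuse policy are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)
D, H, C = 32, 64, 10                       # input size, hidden units, classes
W1 = rng.normal(0.0, 0.1, (D, H))
W2 = rng.normal(0.0, 0.1, (H, C))
lr = 0.01

# Per-label cache: label -> (input, ReLU output, Softmax output) of the most
# recently seen sample with that label.
cache = {}

def forward(x):
    h = np.maximum(x @ W1, 0.0)            # ReLU hidden activations
    z = h @ W2
    e = np.exp(z - z.max())
    return h, e / e.sum()                  # Softmax probabilities

def gradients(x, h, p, label):
    # Standard cross-entropy/Softmax backward pass for this two-layer net.
    dz = p.copy()
    dz[label] -= 1.0
    dW2 = np.outer(h, dz)
    dh = (W2 @ dz) * (h > 0.0)             # backprop through ReLU
    dW1 = np.outer(x, dh)
    return dW1, dW2

def train_step(x, label):
    global W1, W2
    if label in cache:
        # Speculation: reuse activations cached for an earlier sample with the
        # same label, so the backward pass need not wait for this forward pass.
        x_old, h_old, p_old = cache[label]
        dW1, dW2 = gradients(x_old, h_old, p_old, label)
    else:
        h, p = forward(x)                  # no cache yet: ordinary training step
        dW1, dW2 = gradients(x, h, p, label)
    W1 -= lr * dW1
    W2 -= lr * dW2
    # Refresh the cache with this sample's true activations. In a parallel
    # setting this forward pass would overlap with the speculative backward pass.
    h_new, p_new = forward(x)
    cache[label] = (x, h_new, p_new)

# Toy usage: a stream of crude, label-dependent random inputs.
for _ in range(200):
    label = int(rng.integers(C))
    x = rng.normal(0.0, 1.0, D) + 0.1 * label
    train_step(x, label)
```

On parallel hardware, which the paper targets, the speculative gradient computation and the current sample's forward pass would run on separate engines; the sketch executes them back to back only to stay self-contained.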
Appears in Collections: Graduate School > Department of Computer Science and Engineering > 1. Journal Articles

Related Researcher

Suh, Taeweon (Department of Computer Science)
