Detailed Information


Robust and Communication-Efficient Federated Learning From Non-i.i.d. Data

Full metadata record
DC Field: Value
dc.contributor.author: Sattler, Felix
dc.contributor.author: Wiedemann, Simon
dc.contributor.author: Mueller, Klaus-Robert
dc.contributor.author: Samek, Wojciech
dc.date.accessioned: 2021-08-30T15:44:17Z
dc.date.available: 2021-08-30T15:44:17Z
dc.date.created: 2021-06-18
dc.date.issued: 2020-09
dc.identifier.issn: 2162-237X
dc.identifier.uri: https://scholar.korea.ac.kr/handle/2021.sw.korea/53658
dc.description.abstract: Federated learning allows multiple parties to jointly train a deep learning model on their combined data, without any of the participants having to reveal their local data to a centralized server. This form of privacy-preserving collaborative learning, however, comes at the cost of a significant communication overhead during training. To address this problem, several compression methods have been proposed in the distributed training literature that can reduce the amount of required communication by up to three orders of magnitude. These existing methods, however, are of only limited utility in the federated learning setting, as they either compress only the upstream communication from the clients to the server (leaving the downstream communication uncompressed) or perform well only under idealized conditions, such as an i.i.d. distribution of the client data, which typically cannot be found in federated learning. In this article, we propose sparse ternary compression (STC), a new compression framework that is specifically designed to meet the requirements of the federated learning environment. STC extends the existing compression technique of top-k gradient sparsification with a novel mechanism to enable downstream compression as well as ternarization and optimal Golomb encoding of the weight updates. Our experiments on four different learning tasks demonstrate that STC distinctively outperforms federated averaging in common federated learning scenarios. These results advocate for a paradigm shift in federated optimization toward high-frequency low-bitwidth communication, in particular in bandwidth-constrained learning environments.
dc.language: English
dc.language.iso: en
dc.publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
dc.title: Robust and Communication-Efficient Federated Learning From Non-i.i.d. Data
dc.type: Article
dc.contributor.affiliatedAuthor: Mueller, Klaus-Robert
dc.identifier.doi: 10.1109/TNNLS.2019.2944481
dc.identifier.wosid: 000566342500021
dc.identifier.bibliographicCitation: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, v.31, no.9, pp.3400 - 3413
dc.relation.isPartOf: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
dc.citation.title: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
dc.citation.volume: 31
dc.citation.number: 9
dc.citation.startPage: 3400
dc.citation.endPage: 3413
dc.type.rims: ART
dc.type.docType: Article
dc.description.journalClass: 1
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.relation.journalResearchArea: Computer Science
dc.relation.journalResearchArea: Engineering
dc.relation.journalWebOfScienceCategory: Computer Science, Artificial Intelligence
dc.relation.journalWebOfScienceCategory: Computer Science, Hardware & Architecture
dc.relation.journalWebOfScienceCategory: Computer Science, Theory & Methods
dc.relation.journalWebOfScienceCategory: Engineering, Electrical & Electronic
dc.subject.keywordAuthor: Training
dc.subject.keywordAuthor: Data models
dc.subject.keywordAuthor: Servers
dc.subject.keywordAuthor: Deep learning
dc.subject.keywordAuthor: Protocols
dc.subject.keywordAuthor: Training data
dc.subject.keywordAuthor: Distributed databases
dc.subject.keywordAuthor: distributed learning
dc.subject.keywordAuthor: efficient communication
dc.subject.keywordAuthor: federated learning
dc.subject.keywordAuthor: privacy-preserving machine learning
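The abstract describes STC as combining top-k sparsification of the weight updates with ternarization. A minimal, hypothetical sketch of that client-side step is below, assuming a shared magnitude mu equal to the mean absolute value of the surviving entries; the function name, the sparsity parameter p, and this choice of mu are illustrative, and the Golomb encoding and downstream-compression mechanisms from the paper are omitted.

```python
import numpy as np

def sparse_ternary_compress(delta, p=0.01):
    """Illustrative sketch: keep the top-p fraction of update entries
    by magnitude, then replace each survivor with +/- mu, where mu is
    the mean magnitude of the survivors (values become {-mu, 0, +mu})."""
    flat = delta.ravel()
    k = max(1, int(p * flat.size))
    # indices of the k largest-magnitude entries (top-k sparsification)
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    mu = np.abs(flat[idx]).mean()           # shared ternary magnitude
    out = np.zeros_like(flat)
    out[idx] = mu * np.sign(flat[idx])      # ternarize survivors
    return out.reshape(delta.shape)

# Usage: compress a synthetic 100x100 weight-update matrix at p = 1%.
rng = np.random.default_rng(0)
update = rng.normal(size=(100, 100))
compressed = sparse_ternary_compress(update, p=0.01)
```

Because only k positions are nonzero and each carries a single shared magnitude, the update can be transmitted as a sparse index set plus one float, which is the property the encoding stage then exploits.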
Files in This Item
There are no files associated with this item.
Appears in Collections
Graduate School > Department of Artificial Intelligence > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.
