Clustered Federated Learning: Model-Agnostic Distributed Multitask Optimization Under Privacy Constraints
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Sattler, Felix | - |
dc.contributor.author | Mueller, Klaus-Robert | - |
dc.contributor.author | Samek, Wojciech | - |
dc.date.accessioned | 2022-02-26T05:40:53Z | - |
dc.date.available | 2022-02-26T05:40:53Z | - |
dc.date.created | 2022-02-07 | - |
dc.date.issued | 2021-08 | - |
dc.identifier.issn | 2162-237X | - |
dc.identifier.uri | https://scholar.korea.ac.kr/handle/2021.sw.korea/136962 | - |
dc.description.abstract | Federated learning (FL) is currently the most widely adopted framework for collaborative training of (deep) machine learning models under privacy constraints. Despite its popularity, it has been observed that FL yields suboptimal results if the local clients' data distributions diverge. To address this issue, we present clustered FL (CFL), a novel federated multitask learning (FMTL) framework, which exploits geometric properties of the FL loss surface to group the client population into clusters with jointly trainable data distributions. In contrast to existing FMTL approaches, CFL requires no modification of the FL communication protocol, is applicable to general nonconvex objectives (in particular, deep neural networks), does not require the number of clusters to be known a priori, and comes with strong mathematical guarantees on the clustering quality. CFL is flexible enough to handle client populations that vary over time and can be implemented in a privacy-preserving way. As clustering is performed only after FL has converged to a stationary point, CFL can be viewed as a postprocessing method that will always achieve performance greater than or equal to that of conventional FL by allowing clients to arrive at more specialized models. We verify our theoretical analysis in experiments with deep convolutional and recurrent neural networks on commonly used FL data sets. | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC | - |
dc.title | Clustered Federated Learning: Model-Agnostic Distributed Multitask Optimization Under Privacy Constraints | - |
dc.type | Article | - |
dc.contributor.affiliatedAuthor | Mueller, Klaus-Robert | - |
dc.identifier.doi | 10.1109/TNNLS.2020.3015958 | - |
dc.identifier.scopusid | 2-s2.0-85111983262 | - |
dc.identifier.wosid | 000681169500040 | - |
dc.identifier.bibliographicCitation | IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, v.32, no.8, pp.3710 - 3722 | - |
dc.relation.isPartOf | IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS | - |
dc.citation.title | IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS | - |
dc.citation.volume | 32 | - |
dc.citation.number | 8 | - |
dc.citation.startPage | 3710 | - |
dc.citation.endPage | 3722 | - |
dc.type.rims | ART | - |
dc.type.docType | Article | - |
dc.description.journalClass | 1 | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Computer Science | - |
dc.relation.journalResearchArea | Engineering | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Artificial Intelligence | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Hardware & Architecture | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Theory & Methods | - |
dc.relation.journalWebOfScienceCategory | Engineering, Electrical & Electronic | - |
dc.subject.keywordAuthor | Clustering | - |
dc.subject.keywordAuthor | Data models | - |
dc.subject.keywordAuthor | Optimization | - |
dc.subject.keywordAuthor | Privacy | - |
dc.subject.keywordAuthor | Servers | - |
dc.subject.keywordAuthor | Sociology | - |
dc.subject.keywordAuthor | Statistics | - |
dc.subject.keywordAuthor | Training | - |
dc.subject.keywordAuthor | distributed learning | - |
dc.subject.keywordAuthor | federated learning | - |
dc.subject.keywordAuthor | multi-task learning | - |
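The abstract describes CFL's mechanism only informally: after conventional FL converges, the server inspects the geometry of the client updates and separates incompatible clients into clusters. As a rough illustrative sketch (not the paper's exact algorithm — CFL computes an optimal bipartitioning of the pairwise cosine-similarity matrix of client weight updates and uses gradient-norm criteria to decide when a split is warranted), a greedy cosine-similarity bipartition might look like this; all function names here are hypothetical:

```python
import numpy as np

def cosine_similarity_matrix(updates):
    """Pairwise cosine similarities between flattened client weight updates."""
    U = np.stack([u / np.linalg.norm(u) for u in updates])
    return U @ U.T

def greedy_bipartition(sim):
    """Split clients into two groups with low cross-group similarity.

    A greedy stand-in for CFL's optimal bipartitioning: seed the two
    clusters with the least-similar client pair, then attach every
    remaining client to whichever seed it is more similar to.
    """
    n = sim.shape[0]
    i, j = np.unravel_index(np.argmin(sim), sim.shape)
    c1, c2 = [int(i)], [int(j)]
    for k in range(n):
        if k == i or k == j:
            continue
        (c1 if sim[k, i] >= sim[k, j] else c2).append(k)
    return sorted(c1), sorted(c2)
```

With two clients whose updates point in one direction and two whose updates point in the opposite direction, this split recovers the two groups. In CFL proper, the split is triggered only when the norm of the averaged update is small while individual client update norms remain large, indicating an incongruent client population at the stationary point.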