Federated learning on non-IID and long-tailed data via dual-decoupling
Regular Papers | Updated: 2024-06-03
Federated Dual-Decoupling via Model and Logit Calibration (FedDDC) is a novel approach to the challenges posed by non-IID and long-tailed data distributions in federated learning. It decouples the global model, re-weights clients by confidence, re-balances the classifier, and applies decoupled knowledge distillation, significantly improving the accuracy of the global model on non-IID and long-tailed data and outperforming existing state-of-the-art methods.
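The summary above names the components only at a high level. As a rough illustration, the sketch below shows two of the named ideas in generic form: confidence-based re-weighting of client updates during server aggregation, and a logit-adjustment-style classifier re-balancing for long-tailed class counts. The function names, the confidence measure, and the calibration formula are assumptions chosen for illustration, not the paper's exact FedDDC procedure.

```python
# Illustrative sketch only: confidence-weighted aggregation and
# logit-adjustment-style classifier re-balancing. Names, the confidence
# measure, and the calibration formula are assumptions, not FedDDC's
# exact algorithm.

import numpy as np

def reweighted_aggregate(client_weights, client_confidences):
    """Average flattened client parameters, weighting each client by a
    normalized confidence score instead of plain sample counts."""
    conf = np.asarray(client_confidences, dtype=float)
    conf = conf / conf.sum()                     # weights sum to 1
    stacked = np.stack(client_weights)           # (num_clients, num_params)
    return np.tensordot(conf, stacked, axes=1)   # confidence-weighted mean

def calibrate_logits(logits, class_counts, tau=1.0):
    """Re-balance classifier logits for long-tailed data by subtracting a
    term proportional to the log class prior (logit-adjustment style)."""
    prior = np.asarray(class_counts, dtype=float)
    prior = prior / prior.sum()
    return logits - tau * np.log(prior + 1e-12)

if __name__ == "__main__":
    # Two toy clients with flattened parameter vectors and confidence scores.
    clients = [np.array([1.0, 2.0, 3.0]), np.array([3.0, 2.0, 1.0])]
    confidences = [0.9, 0.3]   # e.g., derived from local data quality
    print("aggregated:", reweighted_aggregate(clients, confidences))

    # Head-heavy class counts: calibration boosts tail-class logits.
    logits = np.array([2.0, 1.5, 0.5])
    counts = np.array([1000, 100, 10])
    print("calibrated:", calibrate_logits(logits, counts))
```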
Frontiers of Information Technology & Electronic Engineering, Vol. 25, Issue 5, Pages 728-741 (2024)
Affiliations:
College of Computer Science and Technology, Shanghai University of Electric Power, Shanghai 201306, China
ZHAOHUI WANG, HONGJIAO LI, JINGUO LI, et al. Federated learning on non-IID and long-tailed data via dual-decoupling [J]. Frontiers of Information Technology & Electronic Engineering, 2024, 25(5): 728-741. DOI: 10.1631/FITEE.2300284.