Zhuotao LIAN
University of Aizu
Weizheng WANG
City University of Hong Kong
Huakun HUANG
Guangzhou University
Chunhua SU
University of Aizu
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Zhuotao LIAN, Weizheng WANG, Huakun HUANG, Chunhua SU, "Layer-Based Communication-Efficient Federated Learning with Privacy Preservation" in IEICE TRANSACTIONS on Information and Systems,
vol. E105-D, no. 2, pp. 256-263, February 2022, doi: 10.1587/transinf.2021BCP0006.
Abstract: In recent years, federated learning has attracted increasing attention as it can collaboratively train a global model without gathering the users' raw data, but it also brings many challenges. In this paper, we propose a layer-based federated learning system with privacy preservation. We reduce the communication cost by selecting several layers of the model to upload for global averaging, and enhance privacy protection by applying local differential privacy. We evaluated our system in a non-independently and identically distributed (non-IID) scenario on three datasets. Compared with existing works, our solution achieves better performance in both model accuracy and training time.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.2021BCP0006/_p
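The abstract describes two mechanisms: uploading only a subset of the model's layers for global averaging, and perturbing uploads with local differential privacy. A minimal sketch of that combination in Python/NumPy follows; the function names, the Laplace mechanism, and the sensitivity parameter are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def perturb_and_select(state_dict, upload_layers, epsilon, sensitivity=1.0):
    """Keep only the layers chosen for upload and add Laplace noise
    (a common local-DP mechanism) before sending them to the server."""
    scale = sensitivity / epsilon  # Laplace scale for epsilon-DP
    return {
        name: weights + np.random.laplace(0.0, scale, size=weights.shape)
        for name, weights in state_dict.items()
        if name in upload_layers
    }

def federated_average(client_updates):
    """Element-wise average of the uploaded layers across clients
    (FedAvg restricted to the selected layer subset)."""
    layer_names = client_updates[0].keys()
    return {
        name: np.mean([u[name] for u in client_updates], axis=0)
        for name in layer_names
    }

# Toy round: two clients each upload only the hypothetical layer "fc2",
# so the communication cost excludes the larger "fc1" weights entirely.
clients = [
    {"fc1": np.zeros((2, 2)), "fc2": np.ones(2)},
    {"fc1": np.zeros((2, 2)), "fc2": 3 * np.ones(2)},
]
uploads = [perturb_and_select(c, {"fc2"}, epsilon=1.0) for c in clients]
global_update = federated_average(uploads)
```

Only the selected layer reaches the server, and each client's contribution is noised before leaving the device; the non-uploaded layers would stay local and personalized under this scheme.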
@ARTICLE{e105-d_2_256,
author={Zhuotao LIAN and Weizheng WANG and Huakun HUANG and Chunhua SU},
journal={IEICE TRANSACTIONS on Information and Systems},
title={Layer-Based Communication-Efficient Federated Learning with Privacy Preservation},
year={2022},
volume={E105-D},
number={2},
pages={256-263},
abstract={In recent years, federated learning has attracted increasing attention as it can collaboratively train a global model without gathering the users' raw data, but it also brings many challenges. In this paper, we propose a layer-based federated learning system with privacy preservation. We reduce the communication cost by selecting several layers of the model to upload for global averaging, and enhance privacy protection by applying local differential privacy. We evaluated our system in a non-independently and identically distributed (non-IID) scenario on three datasets. Compared with existing works, our solution achieves better performance in both model accuracy and training time.},
keywords={},
doi={10.1587/transinf.2021BCP0006},
ISSN={1745-1361},
month={February},}
TY - JOUR
TI - Layer-Based Communication-Efficient Federated Learning with Privacy Preservation
T2 - IEICE TRANSACTIONS on Information and Systems
SP - 256
EP - 263
AU - Zhuotao LIAN
AU - Weizheng WANG
AU - Huakun HUANG
AU - Chunhua SU
PY - 2022
DO - 10.1587/transinf.2021BCP0006
JO - IEICE TRANSACTIONS on Information and Systems
SN - 1745-1361
VL - E105-D
IS - 2
JA - IEICE TRANSACTIONS on Information and Systems
Y1 - February 2022
AB - In recent years, federated learning has attracted increasing attention as it can collaboratively train a global model without gathering the users' raw data, but it also brings many challenges. In this paper, we propose a layer-based federated learning system with privacy preservation. We reduce the communication cost by selecting several layers of the model to upload for global averaging, and enhance privacy protection by applying local differential privacy. We evaluated our system in a non-independently and identically distributed (non-IID) scenario on three datasets. Compared with existing works, our solution achieves better performance in both model accuracy and training time.
ER -