To address this, the paper proposes a new approach called Federated Learning, which keeps data on local devices for model training instead of transmitting it. Only model updates are sent to a central server for aggregation, which reduces the risk of privacy leakage and significantly lowers communication overhead.
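Since the notes describe the FedAvg round structure only in prose, a minimal sketch may help. This is an illustrative assumption, not the paper's reference implementation: the function names (`local_update`, `fedavg_round`), the linear model, and the NumPy setup are all stand-ins; the paper trains deep networks. What the sketch does show faithfully is the protocol: clients train locally for several epochs, send back only weights, and the server averages them weighted by each client's sample count n_k/n, as in the paper.

```python
import numpy as np

def local_update(weights, data, labels, lr=0.1, epochs=5):
    """One client's local SGD (full-batch, MSE loss) on a linear model.
    Illustrative stand-in for the deep networks in the paper; the raw
    data never leaves this function."""
    w = weights.copy()
    for _ in range(epochs):
        preds = data @ w
        grad = data.T @ (preds - labels) / len(labels)
        w -= lr * grad
    return w

def fedavg_round(global_weights, clients):
    """One communication round: each client returns updated weights only;
    the server aggregates with a weighted average (weight = n_k / n)."""
    total = sum(len(labels) for _, labels in clients)
    new_weights = np.zeros_like(global_weights)
    for data, labels in clients:
        w_k = local_update(global_weights, data, labels)
        new_weights += (len(labels) / total) * w_k
    return new_weights

# Toy usage: three clients, each holding private data locally.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(20):
    w = fedavg_round(w, clients)
print(w)  # approaches true_w without any client sharing raw data
```

Note that `epochs > 1` is the point of the paper's communication efficiency: each client does multiple local passes per round, so far fewer communication rounds are needed than with one gradient step per round (FedSGD).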

In-depth reading: https://blog.csdn.net/chrnhao/article/details/142751006

Summary: https://blog.csdn.net/C__King_6/article/details/143382030

https://blog.csdn.net/m0_72913514/article/details/141780685