Privacy-preserving Decentralized Federated Learning over Time-varying Communication Graph

Lu, Yang and Yu, Zhengxin and Suri, Neeraj (2022) Privacy-preserving Decentralized Federated Learning over Time-varying Communication Graph. arXiv, abs/22. ISSN 2331-8422


Abstract

Establishing how a set of learners can provide privacy-preserving federated learning in a fully decentralized (peer-to-peer, no coordinator) manner is an open problem. We propose the first privacy-preserving consensus-based algorithm that enables distributed learners to achieve decentralized global model aggregation in an environment of high mobility, where the communication graph among the learners may vary between successive rounds of model aggregation. In particular, in each round of global model aggregation, the Metropolis-Hastings method is applied to update the weighted adjacency matrix based on the current communication topology. In addition, Shamir's secret sharing scheme is integrated to preserve privacy while reaching consensus on the global model. The paper establishes the correctness and privacy properties of the proposed algorithm. Computational efficiency is evaluated through a simulation built on a federated learning framework with a real-world dataset.
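The two building blocks named in the abstract, Metropolis-Hastings consensus weights derived from the current communication graph and Shamir's secret sharing of the values being averaged, can be illustrated with a short sketch. The snippet below is not the paper's implementation; the prime field, the function names, and the toy usage at the end are assumptions made purely for illustration.

```python
import random
import numpy as np

PRIME = 2**61 - 1  # assumed prime field for Shamir sharing (illustrative choice)


def metropolis_hastings_weights(adj):
    """Build a doubly-stochastic weight matrix from a 0/1 adjacency matrix.

    w_ij = 1 / (1 + max(deg_i, deg_j)) for neighbours i != j,
    w_ii = 1 - sum_{j != i} w_ij, and w_ij = 0 otherwise.
    """
    n = adj.shape[0]
    deg = adj.sum(axis=1)
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j and adj[i, j]:
                W[i, j] = 1.0 / (1.0 + max(deg[i], deg[j]))
        W[i, i] = 1.0 - W[i].sum()
    return W


def shamir_share(secret, n_shares, threshold):
    """Split an integer secret into n_shares points on a random polynomial of
    degree threshold - 1; any `threshold` shares suffice to reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    shares = []
    for x in range(1, n_shares + 1):
        y = sum(c * pow(x, k, PRIME) for k, c in enumerate(coeffs)) % PRIME
        shares.append((x, y))
    return shares


def shamir_reconstruct(shares):
    """Recover the secret (the polynomial's value at x = 0) by Lagrange
    interpolation over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * (-xj)) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret


if __name__ == "__main__":
    # Toy check: a path graph of 3 learners and one secret-shared value.
    A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]])
    W = metropolis_hastings_weights(A)
    assert np.allclose(W.sum(axis=0), 1) and np.allclose(W.sum(axis=1), 1)

    shares = shamir_share(secret=123456, n_shares=3, threshold=2)
    assert shamir_reconstruct(shares[:2]) == 123456
```

The Metropolis-Hastings construction is attractive in the time-varying setting because it only needs each node's local degree information for the current round, and the resulting weight matrix is symmetric and doubly stochastic, which is what average consensus requires.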

Item Type: Journal Article
Journal or Publication Title: arXiv
ID Code: 183017
Deposited On: 17 Jan 2023 14:55
Refereed?: Yes
Published?: Published
Last Modified: 19 Oct 2023 14:40