Robust Knowledge Distillation in Federated Learning: Counteracting Backdoor Attacks

Alharbi, Ebtisaam and Soriano Marcolino, Leandro and Ni, Qiang and Gouglidis, Antonios (2024) Robust Knowledge Distillation in Federated Learning: Counteracting Backdoor Attacks. In: Proceedings of the IEEE Conference on Secure and Trustworthy Machine Learning (SaTML). IEEE. (In Press)

SaTML25_Robust_Knowledge_Distillation_in_Federated_Learning_Counteracting_Backdoor_Attacks_27_.pdf - Accepted Version
Available under License Creative Commons Attribution.

Abstract

Federated Learning (FL) enables collaborative model training across multiple devices while preserving data privacy. However, it remains susceptible to backdoor attacks, in which malicious participants compromise the global model. Existing defence methods are limited by strict assumptions about data heterogeneity (non-independent and identically distributed, or non-IID, data) and the proportion of malicious clients, which reduces their practicality and effectiveness. To overcome these limitations, we propose Robust Knowledge Distillation (RKD), a novel defence mechanism that enhances model integrity without relying on such restrictive assumptions. RKD integrates clustering and model selection techniques to identify and filter out malicious updates, forming a reliable ensemble of models. It then employs knowledge distillation to transfer the ensemble's collective insights to the global model. Extensive evaluations demonstrate that RKD effectively mitigates backdoor threats while maintaining high model performance, outperforming current state-of-the-art defence methods across various scenarios.
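
For illustration only, the following minimal Python sketch mimics the pipeline the abstract describes: cluster the client updates, keep a trusted ensemble, and distil its averaged predictions into the global model. Every concrete choice here (cosine-distance agglomerative clustering with a fixed two clusters, keeping the majority cluster, a linear student distilled on a random unlabeled proxy set) is an assumption made for the sketch, not a detail taken from the paper.

# Hypothetical sketch of one RKD-style aggregation round, based only on
# the abstract above; the paper's actual clustering criterion, selection
# rule, and distillation objective may differ.
import numpy as np
from sklearn.cluster import AgglomerativeClustering

rng = np.random.default_rng(0)
dim, n_classes = 20, 2
feat = dim // n_classes  # each update reshapes to a (feat, n_classes) weight matrix

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# 1. Simulated client updates: benign clients drift around a shared
#    direction; backdoored clients push the opposite way.
true_dir = rng.normal(size=dim)
benign = true_dir + rng.normal(0.0, 0.2, size=(8, dim))
poisoned = -true_dir + rng.normal(0.0, 0.2, size=(2, dim))
updates = np.vstack([benign, poisoned])

# 2. Cluster updates by cosine distance and keep the majority cluster as
#    the trusted ensemble (a stand-in for RKD's clustering/selection step).
labels = AgglomerativeClustering(
    n_clusters=2, metric="cosine", linkage="average"
).fit_predict(updates)
ensemble = updates[labels == np.bincount(labels).argmax()]
print(f"kept {len(ensemble)} of {len(updates)} updates")

# 3. Knowledge distillation: fit the global model to the ensemble's
#    averaged soft predictions on a small unlabeled proxy set.
proxy_x = rng.normal(size=(64, feat))

def logits(w, x):
    return x @ w.reshape(feat, n_classes)

teacher = np.mean([softmax(logits(w, proxy_x)) for w in ensemble], axis=0)

global_w, lr = np.zeros(dim), 0.5
for _ in range(200):  # gradient descent on cross-entropy to the teacher probabilities
    student = softmax(logits(global_w, proxy_x))
    grad = proxy_x.T @ (student - teacher) / len(proxy_x)
    global_w -= lr * grad.ravel()

Because the poisoned updates point away from the benign consensus direction, the cosine clustering isolates them and the distilled global model is fitted only to the filtered ensemble's soft labels.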

Item Type:
Contribution in Book/Report/Proceedings
ID Code:
227784
Deposited On:
25 Feb 2025 11:45
Refereed?:
Yes
Published?:
In Press
Last Modified:
26 Mar 2025 00:51