FedDefender: Backdoor Attack Defense in Federated Learning
dc.contributor.author | Gill, Waris | en |
dc.contributor.author | Anwar, Ali | en |
dc.contributor.author | Gulzar, Muhammad Ali | en |
dc.date.accessioned | 2024-03-01T13:18:06Z | en |
dc.date.available | 2024-03-01T13:18:06Z | en |
dc.date.issued | 2023-12-04 | en |
dc.date.updated | 2024-01-01T08:55:46Z | en |
dc.description.abstract | Federated Learning (FL) is a privacy-preserving distributed machine learning technique that enables individual clients (e.g., user participants, edge devices, or organizations) to train a model on their local data in a secure environment and then share the trained model with an aggregator to build a global model collaboratively. In this work, we propose FedDefender, a defense mechanism against targeted poisoning attacks in FL that leverages differential testing. FedDefender first applies differential testing to clients’ models using a synthetic input. Instead of comparing the output (predicted label), which is unavailable for a synthetic input, FedDefender fingerprints the neuron activations of clients’ models to identify a potentially malicious client containing a backdoor. We evaluate FedDefender on the MNIST and FashionMNIST datasets with 20 and 30 clients, and our results demonstrate that FedDefender effectively mitigates such attacks, reducing the attack success rate (ASR) to 10% without deteriorating the global model’s performance. | en |
dc.description.version | Published version | en |
dc.format.mimetype | application/pdf | en |
dc.identifier.doi | https://doi.org/10.1145/3617574.3617858 | en |
dc.identifier.uri | https://hdl.handle.net/10919/118228 | en |
dc.language.iso | en | en |
dc.publisher | ACM | en |
dc.rights | Creative Commons Attribution 4.0 International | en |
dc.rights.holder | The author(s) | en |
dc.rights.uri | http://creativecommons.org/licenses/by/4.0/ | en |
dc.title | FedDefender: Backdoor Attack Defense in Federated Learning | en |
dc.type | Article - Refereed | en |
dc.type.dcmitype | Text | en |
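The abstract's core idea, fingerprinting clients' neuron activations on a synthetic input and flagging the client whose fingerprint deviates most, can be sketched as follows. This is an illustrative toy, not the authors' implementation: the one-layer "models", the Euclidean fingerprint distance, and all function names (`activations`, `suspect_client`) are assumptions made for the example.

```python
import random
import statistics

def activations(weights, synthetic_input):
    """Toy one-layer 'model': neuron activations = ReLU(W @ x).
    Stands in for the per-neuron activation profile FedDefender inspects."""
    return [max(0.0, sum(w * x for w, x in zip(row, synthetic_input)))
            for row in weights]

def fingerprint_distance(a, b):
    """Euclidean distance between two activation fingerprints."""
    return sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5

def suspect_client(client_weights, synthetic_input):
    """Return the index of the client most dissimilar to the rest.
    Illustrative scoring rule: mean distance to all other fingerprints."""
    prints = [activations(w, synthetic_input) for w in client_weights]
    scores = [statistics.mean(fingerprint_distance(p, q)
                              for j, q in enumerate(prints) if j != i)
              for i, p in enumerate(prints)]
    return max(range(len(scores)), key=scores.__getitem__)

# Simulated round: four benign clients with similar weights, one
# backdoored client whose weights (hence activations) drift away.
random.seed(0)
n_neurons, n_inputs = 4, 3
benign = [[[random.gauss(0.5, 0.05) for _ in range(n_inputs)]
           for _ in range(n_neurons)] for _ in range(4)]
malicious = [[random.gauss(2.0, 0.05) for _ in range(n_inputs)]
             for _ in range(n_neurons)]
clients = benign + [malicious]
x_syn = [random.random() for _ in range(n_inputs)]
print(suspect_client(clients, x_syn))  # flags the last (malicious) client
```

Note that no labeled data is needed: because the synthetic input has no ground-truth label, the comparison is over activation patterns rather than predicted outputs, which is exactly the substitution the abstract describes.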