⚔️ Blades: A Unified Benchmark Suite for Attacks and Defenses in Federated Learning

Installation

git clone https://github.com/lishenghui/blades
cd blades
pip install -v -e .
# "-v" means verbose, or more output
# "-e" means installing a project in editable mode,
# thus any local modifications made to the code will take effect without reinstallation.
cd blades/blades
python train.py file ./tuned_examples/fedsgd_cnn_fashion_mnist.yaml

Blades internally calls ray.tune, so experiment results are written to its default output directory, ~/ray_results.
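
To inspect finished runs without going through Ray's APIs, one option is to read the CSV logs directly. Below is a minimal sketch (not part of blades), assuming Ray Tune's default CSV logger, which writes a progress.csv file into each trial directory; the exact directory layout and column names depend on your Ray version and on the metrics the experiment reports.

import glob
import os

import pandas as pd

result_root = os.path.expanduser("~/ray_results")
for csv_path in glob.glob(os.path.join(result_root, "**", "progress.csv"), recursive=True):
    df = pd.read_csv(csv_path)
    # Columns depend on what the experiment reports; just show what is there.
    print(csv_path, len(df), "rows, columns:", list(df.columns))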

Experiment Results

Fashion-MNIST results: https://github.com/lishenghui/blades/blob/master/docs/source/images/fashion_mnist.png

CIFAR-10 results: https://github.com/lishenghui/blades/blob/master/docs/source/images/cifar10.png

Cluster Deployment

To run Blades on a cluster, you only need to deploy a Ray cluster according to the official guide.
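
As a rough sketch of what the driver side can look like once such a cluster is running (this is plain Ray usage, not a blades API; whether train.py needs an explicit connection call or picks up the cluster on its own should be checked against the documentation):

import ray

# Attach this process to an already-running Ray cluster (started per the
# official guide, e.g. `ray start --head` on the head node and
# `ray start --address=<head-ip>:6379` on the workers).
ray.init(address="auto")

# Sanity-check the resources Ray can see before launching experiments.
print(ray.cluster_resources())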

Built-in Implementations

The following attack and defense strategies are currently implemented:

Attacks

General Attacks

Noise: Adds random noise to the updates.
LabelFlipping: Fang et al., "Local Model Poisoning Attacks to Byzantine-Robust Federated Learning", USENIX Security '20.
SignFlipping: Li et al., "RSA: Byzantine-Robust Stochastic Aggregation Methods for Distributed Learning from Heterogeneous Datasets", AAAI '19.
ALIE: Baruch et al., "A Little Is Enough: Circumventing Defenses for Distributed Learning", NeurIPS '19.
IPM: Xie et al., "Fall of Empires: Breaking Byzantine-Tolerant SGD by Inner Product Manipulation", UAI '20.
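
As a concrete illustration of the simpler entries above, here is a hedged NumPy sketch of sign flipping and IPM applied to plain gradient vectors; it is not blades' implementation and ignores the framework's client/adversary interfaces.

import numpy as np

def signflipping(update: np.ndarray) -> np.ndarray:
    # Send the negated update, pushing the aggregate away from descent.
    return -update

def ipm(benign_updates: np.ndarray, epsilon: float = 0.5) -> np.ndarray:
    # Inner-product manipulation: submit a scaled, negated mean of the benign
    # updates so the aggregate's inner product with the true mean turns negative.
    return -epsilon * benign_updates.mean(axis=0)

rng = np.random.default_rng(0)
benign = rng.normal(size=(10, 4))  # 10 honest clients, 4 model parameters
print(signflipping(benign[0]))
print(ipm(benign))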

Adaptive Attacks

DistanceMaximization: Shejwalkar et al., "Manipulating the Byzantine: Optimizing Model Poisoning Attacks and Defenses for Federated Learning", NDSS '21.
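
For flavor, a hedged NumPy sketch of a Min-Max style distance-maximization attack in the spirit of Shejwalkar et al. (again, not blades' code): a perturbation direction is scaled as far as possible while the crafted update stays within the maximum pairwise distance observed among the benign updates.

import numpy as np

def min_max_attack(benign: np.ndarray, iters: int = 30) -> np.ndarray:
    mean = benign.mean(axis=0)
    direction = -np.sign(mean)  # one common perturbation choice (inverse sign)
    max_benign_dist = max(np.linalg.norm(a - b) for a in benign for b in benign)

    # Bracket the scale: grow hi until the crafted update clearly stands out.
    lo, hi = 0.0, 1.0
    while max(np.linalg.norm(mean + hi * direction - b) for b in benign) <= max_benign_dist:
        hi *= 2.0

    # Binary search for the largest scale that still blends in with benign updates.
    for _ in range(iters):
        gamma = (lo + hi) / 2.0
        crafted = mean + gamma * direction
        if max(np.linalg.norm(crafted - b) for b in benign) <= max_benign_dist:
            lo = gamma
        else:
            hi = gamma
    return mean + lo * direction

rng = np.random.default_rng(0)
print(min_max_attack(rng.normal(size=(10, 4))))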

Defenses

Robust Aggregation
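
As a generic illustration of what robust aggregation means in this context, here is a minimal sketch of two textbook aggregators (coordinate-wise median and trimmed mean); these are standard formulations, not blades' own implementations.

import numpy as np

def coordinate_median(updates: np.ndarray) -> np.ndarray:
    # Aggregate client updates by taking the median of each coordinate.
    return np.median(updates, axis=0)

def trimmed_mean(updates: np.ndarray, trim: int = 1) -> np.ndarray:
    # Drop the `trim` largest and smallest values per coordinate, then average.
    sorted_updates = np.sort(updates, axis=0)
    return sorted_updates[trim:len(updates) - trim].mean(axis=0)

rng = np.random.default_rng(0)
updates = rng.normal(size=(10, 4))
updates[0] += 100.0  # one Byzantine client submitting a huge update
print(coordinate_median(updates))
print(trimmed_mean(updates, trim=2))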

Data Partitioners

Dirichlet Partitioner

https://github.com/lishenghui/blades/blob/master/docs/source/images/dirichlet_partition.png
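
A hedged sketch (not blades' code) of the usual Dirichlet label partition: for every class, draw client proportions from Dirichlet(alpha) and split that class's sample indices accordingly; smaller alpha gives a more skewed, non-IID split.

import numpy as np

def dirichlet_partition(labels: np.ndarray, num_clients: int, alpha: float,
                        seed: int = 0) -> list[list[int]]:
    rng = np.random.default_rng(seed)
    client_indices = [[] for _ in range(num_clients)]
    for cls in np.unique(labels):
        idx = np.flatnonzero(labels == cls)
        rng.shuffle(idx)
        # Per-class proportions over clients; alpha controls the skew.
        proportions = rng.dirichlet(alpha * np.ones(num_clients))
        splits = np.split(idx, (np.cumsum(proportions)[:-1] * len(idx)).astype(int))
        for client, part in enumerate(splits):
            client_indices[client].extend(part.tolist())
    return client_indices

labels = np.random.default_rng(0).integers(0, 10, size=1000)
parts = dirichlet_partition(labels, num_clients=5, alpha=0.1)
print([len(p) for p in parts])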

Sharding Partitioner

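A hedged sketch (not blades' code) of shard-based partitioning in the FedAvg style: sort samples by label, cut them into equal shards, and hand each client a fixed number of shards, so each client only sees a few classes.

import numpy as np

def shard_partition(labels: np.ndarray, num_clients: int,
                    shards_per_client: int = 2, seed: int = 0) -> list[list[int]]:
    rng = np.random.default_rng(seed)
    order = np.argsort(labels)  # group sample indices by label
    shards = np.array_split(order, num_clients * shards_per_client)
    shard_ids = rng.permutation(len(shards))
    return [
        np.concatenate(
            [shards[s] for s in
             shard_ids[c * shards_per_client:(c + 1) * shards_per_client]]
        ).tolist()
        for c in range(num_clients)
    ]

labels = np.random.default_rng(0).integers(0, 10, size=1000)
parts = shard_partition(labels, num_clients=5)
print([sorted(set(labels[p])) for p in parts])  # classes seen by each client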

Citation

Please cite our paper (and the respective papers of the methods used) if you use this code in your own work:

@inproceedings{li2024blades,
  title={Blades: A Unified Benchmark Suite for Byzantine Attacks and Defenses in Federated Learning},
  author={Li, Shenghui and Ngai, Edith and Ye, Fanghua and Ju, Li and Zhang, Tianru and Voigt, Thiemo},
  booktitle={2024 IEEE/ACM Ninth International Conference on Internet-of-Things Design and Implementation (IoTDI)},
  year={2024}
}