Only one configuration file is needed to support model heterogeneity, with consistent GPU memory usage for single or multiple clients.
Code and pretrained models for paper: Data-Free Adversarial Distillation
[IJCAI-2021] Contrastive Model Inversion for Data-Free Knowledge Distillation
Official PyTorch implementation of Data-free Knowledge Distillation for Object Detection, WACV 2021.
[ICML2023] Revisiting Data-Free Knowledge Distillation with Poisoned Teachers
FedL2G: Learning to Guide Local Training in Heterogeneous Federated Learning
The official implementation of "DataFreeShield: Defending Adversarial Attacks without Training Data" accepted in ICML 2024.
[ICCV 2023] "TRM-UAP: Enhancing the Transferability of Data-Free Universal Adversarial Perturbation via Truncated Ratio Maximization", Yiran Liu, Xin Feng, Yunlong Wang, Wu Yang, Di Ming*