[IJCAI 2024] Papers about graph reduction including graph coarsening, graph condensation, graph sparsification, graph summarization, etc.
[ICLR'22] [KDD'22] [IJCAI'24] Implementation of "Graph Condensation for Graph Neural Networks" (gradient-matching condensation; see the sketch after this list)
(NeurIPS 2023 spotlight) Large-scale dataset distillation/condensation; at 50 IPC (images per class) it achieves 60.8% on the original ImageNet-1K validation set, the highest reported at the time.
Official PyTorch implementation of "Dataset Condensation via Efficient Synthetic-Data Parameterization" (ICML'22)
[ICLR 2024] "Towards Lossless Dataset Distillation via Difficulty-Aligned Trajectory Matching" (DATM; see the trajectory-matching sketch after this list)
Official PyTorch Implementation for the "Distilling Datasets Into Less Than One Image" paper.
Code for "Backdoor Attacks Against Dataset Distillation"
Awesome Graph Condensation Papers
[ICLR 2024] "Data Distillation Can Be Like Vodka: Distilling More Times For Better Quality" by Xuxi Chen*, Yu Yang*, Zhangyang Wang, Baharan Mirzasoleiman
An Efficient Dataset Condensation Plugin and Its Application to Continual Learning (NeurIPS 2023)
Dataset Distillation on 3D Point Clouds using Gradient Matching (same core loop as the sketch after this list)
A collection of dataset distillation papers.
Continual learning code for the SRe2L paper (NeurIPS 2023 spotlight)
Code for our paper "Towards Trustworthy Dataset Distillation" (Pattern Recognition 2025)
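Several entries above (the GCond implementation and the point-cloud distillation repo) are built on gradient matching: the synthetic set is optimized so that the gradients it induces in a model mimic those induced by the real data. Below is a minimal, generic sketch of that loop, not any repository's actual code; the two-layer MLP, data shapes, and hyperparameters are illustrative assumptions, and GCond additionally learns a synthetic graph structure that is omitted here.

```python
# Minimal gradient-matching condensation sketch (assumed shapes/model, not any
# repo's actual code). GCond also parameterizes a synthetic adjacency from the
# learned features; that part is omitted here for brevity.
import torch
import torch.nn.functional as F

torch.manual_seed(0)
n_real, n_syn, dim, n_cls = 1000, 20, 32, 4

x_real = torch.randn(n_real, dim)                     # stand-in for real features
y_real = torch.randint(0, n_cls, (n_real,))

x_syn = torch.randn(n_syn, dim, requires_grad=True)   # learnable synthetic set
y_syn = torch.arange(n_cls).repeat(n_syn // n_cls)    # fixed, balanced labels

model = torch.nn.Sequential(torch.nn.Linear(dim, 64), torch.nn.ReLU(),
                            torch.nn.Linear(64, n_cls))
opt_syn = torch.optim.Adam([x_syn], lr=0.01)

def loss_grads(x, y, create_graph):
    """Per-parameter gradients of the classification loss."""
    loss = F.cross_entropy(model(x), y)
    return torch.autograd.grad(loss, model.parameters(), create_graph=create_graph)

for step in range(200):
    # Full methods re-sample/update the network across outer steps; a single
    # fixed initialization is kept here to stay short.
    g_real = [g.detach() for g in loss_grads(x_real, y_real, create_graph=False)]
    g_syn = loss_grads(x_syn, y_syn, create_graph=True)
    # Match per-layer gradients; a sum of cosine distances is a common choice.
    match = sum(1 - F.cosine_similarity(gr.flatten(), gs.flatten(), dim=0)
                for gr, gs in zip(g_real, g_syn))
    opt_syn.zero_grad()
    match.backward()
    opt_syn.step()
```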
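The DATM entry matches training trajectories rather than single-step gradients: a student trained for a few steps on the synthetic data should land where an expert trained on the real data landed. The sketch below shows the generic trajectory-matching objective under assumed sizes and hyperparameters (it needs PyTorch ≥ 2.0 for torch.func); DATM's specific contribution, choosing which trajectory segment to match based on difficulty, is reduced here to a fixed start step t.

```python
# Minimal trajectory-matching sketch (assumed model/sizes, not the DATM code).
# torch.func.functional_call keeps the inner SGD steps differentiable w.r.t.
# the synthetic data.
import torch
import torch.nn.functional as F
from torch.func import functional_call

torch.manual_seed(0)
dim, n_cls, n_syn = 16, 3, 9
model = torch.nn.Linear(dim, n_cls)

# 1) Record an expert trajectory by training on real data.
x_real = torch.randn(500, dim)
y_real = torch.randint(0, n_cls, (500,))
expert = [{k: v.detach().clone() for k, v in model.named_parameters()}]
opt = torch.optim.SGD(model.parameters(), lr=0.1)
for _ in range(20):
    opt.zero_grad()
    F.cross_entropy(model(x_real), y_real).backward()
    opt.step()
    expert.append({k: v.detach().clone() for k, v in model.named_parameters()})

# 2) Optimize synthetic data so that M student steps starting from expert[t]
#    reproduce the expert's movement to expert[t + M].
x_syn = torch.randn(n_syn, dim, requires_grad=True)
y_syn = torch.arange(n_cls).repeat(n_syn // n_cls)
opt_syn = torch.optim.Adam([x_syn], lr=0.01)
t, M, inner_lr = 5, 5, 0.1   # DATM instead aligns t with sample difficulty

for step in range(100):
    params = {k: v.clone().requires_grad_(True) for k, v in expert[t].items()}
    for _ in range(M):  # differentiable inner SGD on the synthetic set
        loss = F.cross_entropy(functional_call(model, params, (x_syn,)), y_syn)
        grads = torch.autograd.grad(loss, list(params.values()), create_graph=True)
        params = {k: p - inner_lr * g for (k, p), g in zip(params.items(), grads)}
    # Parameter-matching loss, normalized by how far the expert moved.
    num = sum(((params[k] - expert[t + M][k]) ** 2).sum() for k in params)
    den = sum(((expert[t][k] - expert[t + M][k]) ** 2).sum() for k in params)
    opt_syn.zero_grad()
    (num / den).backward()
    opt_syn.step()
```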