SGFormer: Simplified Graph Transformers

The official implementation for the NeurIPS 2023 paper "SGFormer: Simplifying and Empowering Transformers for Large-Graph Representations".

Related material: [Paper], [Blog], [Video]

SGFormer is a graph encoder backbone that efficiently computes all-pair interactions with one-layer attentive propagation.

SGFormer builds upon our previous works on scalable graph Transformers with linear complexity: NodeFormer (NeurIPS 2022, spotlight) and DIFFormer (ICLR 2023, spotlight).

What's new

[2023.10.28] We release the code for the model on large-graph benchmarks. More details will be added soon.

[2023.12.20] We add more details on how to run the code.

[2024.05.05] We add code for measuring time and memory costs in ./medium/time_test.py.

Model and Results

The model adopts a simple architecture, consisting of a single-layer global attention module and a shallow GNN.

[Figure: SGFormer model architecture]
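To make the design concrete, below is a minimal PyTorch sketch of this two-branch layout, assuming a kernelized (linear-attention-style) global branch and a GCN for the local branch. The class names (LinearGlobalAttention, SGFormerSketch), the ELU feature map, and the fixed mixing weight alpha are illustrative simplifications, not the repository's actual implementation.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F
    from torch_geometric.nn import GCNConv


    class LinearGlobalAttention(nn.Module):
        """Single-layer all-pair attention with linear complexity (illustrative sketch)."""

        def __init__(self, in_dim, hidden_dim):
            super().__init__()
            self.q = nn.Linear(in_dim, hidden_dim)
            self.k = nn.Linear(in_dim, hidden_dim)
            self.v = nn.Linear(in_dim, hidden_dim)

        def forward(self, x):
            # Positive feature map (ELU + 1), a common linear-attention kernel choice.
            q = F.elu(self.q(x)) + 1.0                       # [N, d]
            k = F.elu(self.k(x)) + 1.0                       # [N, d]
            v = self.v(x)                                    # [N, d]
            # Compute K^T V first, then multiply by Q: O(N d^2) instead of O(N^2 d).
            kv = k.t() @ v                                   # [d, d]
            num = q @ kv                                     # [N, d]
            denom = q @ k.sum(dim=0, keepdim=True).t()       # [N, 1], strictly positive
            return num / denom


    class SGFormerSketch(nn.Module):
        """One-layer global attention fused with a shallow GNN (illustrative sketch)."""

        def __init__(self, in_dim, hidden_dim, out_dim, alpha=0.5):
            super().__init__()
            self.attn = LinearGlobalAttention(in_dim, hidden_dim)
            self.gnn = GCNConv(in_dim, hidden_dim)
            self.alpha = alpha                               # weight between global and local branches
            self.out = nn.Linear(hidden_dim, out_dim)

        def forward(self, x, edge_index):
            z_global = self.attn(x)                          # all-pair interactions over nodes
            z_local = F.relu(self.gnn(x, edge_index))        # propagation over the input graph
            z = self.alpha * z_global + (1 - self.alpha) * z_local
            return self.out(z)

The point of the sketch is the global branch: by computing K^T V before multiplying by Q, all-pair interactions cost time linear in the number of nodes rather than quadratic.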

The following tables present the results for standard node classification tasks on medium-sized and large-sized graphs.

[Tables: node classification results on medium-sized and large-sized graphs]

Requirements

For all datasets except ogbn-papers100M, we used the environment with the package versions listed in ./large/requirement.txt. For ogbn-papers100M, PyG version >= 2.0 is required to run the code.

Dataset

One can download the datasets (Planetoid, Deezer, Pokec, Actor/Film) from the Google Drive link below:

https://drive.google.com/drive/folders/1rr3kewCBUvIuVxA6MJ90wzQuF-NnCRtf?usp=drive_link

For Chameleon and Squirrel, we use the new splits that filter out overlapping nodes.

The OGB datasets are downloaded automatically when running the code, as illustrated below.
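For instance, with the ogb package the download and preprocessing happen transparently on first use (a minimal illustration; the dataset name and root directory below are placeholders, not settings from this repository):

    from ogb.nodeproppred import PygNodePropPredDataset

    # Downloads and preprocesses the dataset on first use (root path is a placeholder).
    dataset = PygNodePropPredDataset(name='ogbn-arxiv', root='./data')
    split_idx = dataset.get_idx_split()   # standard train/valid/test node splits
    graph = dataset[0]                    # a torch_geometric Data object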

Run the code

Please refer to the bash script run.sh in each folder to run the training and evaluation pipeline.

Citation

If you find our code and model useful, please cite our work. Thank you!

      @inproceedings{wu2023sgformer,
        title={SGFormer: Simplifying and Empowering Transformers for Large-Graph Representations},
        author={Qitian Wu and Wentao Zhao and Chenxiao Yang and Hengrui Zhang and Fan Nie and Haitian Jiang and Yatao Bian and Junchi Yan},
        booktitle={Advances in Neural Information Processing Systems (NeurIPS)},
        year={2023}
      }
