PyTorch repository for "DeepGCNs: Can GCNs Go as Deep as CNNs?" (ICCV 2019 Oral). Project page: https://www.deepgcns.org

DeepGCNs: Can GCNs Go as Deep as CNNs?

In this work, we present new ways to successfully train very deep GCNs. We borrow concepts from CNNs, mainly residual/dense connections and dilated convolutions, and adapt them to GCN architectures. Through extensive experiments, we show the positive effect of these deep GCN frameworks.

[Project] [Paper] [Slides] [Tensorflow Code] [Pytorch Code]

Overview

We conduct extensive experiments to show how different components (#Layers, #Filters, #Nearest Neighbors, Dilation, etc.) affect DeepGCNs. We also provide ablation studies on different types of deep GCN operators (MRGCN, EdgeConv, GraphSAGE, and GIN).
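
As a rough illustration of the two ideas borrowed from CNNs, the sketch below shows a residual graph block and dilated k-NN neighbor selection in plain PyTorch. All names here are illustrative; this is a minimal sketch of the concepts, not the actual gcn_lib implementation.

import torch
import torch.nn as nn

def dilated_knn(x, k=9, dilation=2):
    # Keep every `dilation`-th neighbor out of the k * dilation nearest ones.
    # x: (N, C) node features -> (N, k) neighbor indices.
    dist = torch.cdist(x, x)
    idx = dist.topk(k * dilation, largest=False).indices
    return idx[:, ::dilation]

class ResGraphBlock(nn.Module):
    # Residual block: aggregate edge features over neighbors (max, as in
    # EdgeConv/MRGCN-style operators) and add the result back to the input.
    def __init__(self, channels):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(2 * channels, channels), nn.ReLU())

    def forward(self, x, neighbor_idx):
        neighbors = x[neighbor_idx]                    # (N, k, C)
        central = x.unsqueeze(1).expand_as(neighbors)  # (N, k, C)
        edge_feat = torch.cat([central, neighbors - central], dim=-1)
        out = self.mlp(edge_feat).max(dim=1).values    # aggregate over neighbors
        return x + out                                 # residual connection

# Stacking many such blocks remains trainable thanks to the skip connections.
x = torch.randn(1024, 64)
block = ResGraphBlock(64)
x = block(x, dilated_knn(x, k=9, dilation=2))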

Requirements

Install the environment by running:

source deepgcn_env_install.sh

Code Architecture

.
├── misc                    # Misc images
├── utils                   # Common useful modules
├── gcn_lib                 # gcn library
│   ├── dense               # gcn library for dense data (B x C x N x 1)
│   └── sparse              # gcn library for sparse data (N x C)
├── examples 
│   ├── modelnet_cls        # code for point clouds classification on ModelNet40
│   ├── sem_seg_dense       # code for point clouds semantic segmentation on S3DIS (data type: dense)
│   ├── sem_seg_sparse      # code for point clouds semantic segmentation on S3DIS (data type: sparse)
│   ├── part_sem_seg        # code for part segmentation on PartNet
│   ├── ppi                 # code for node classification on PPI dataset
│   └── ogb                 # code for node/graph property prediction on OGB datasets
└── ...
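
The dense and sparse variants of gcn_lib differ mainly in how a batch of point clouds is laid out. The sketch below (variable names are ours, not the library's) converts a dense batch (B x C x N x 1) into the sparse layout, where all points are stacked into a single matrix together with a batch index vector, similar to how PyTorch Geometric batches graphs.

import torch

B, C, N = 4, 6, 1024
dense = torch.randn(B, C, N, 1)                  # dense layout: B x C x N x 1

# Sparse layout: stack all points of the batch into one (B*N) x C matrix ...
sparse = dense.squeeze(-1).permute(0, 2, 1).reshape(B * N, C)

# ... plus a batch vector recording which sample each point belongs to.
batch = torch.arange(B).repeat_interleave(N)     # shape (B*N,)

# Going back to the dense layout:
dense_again = sparse.reshape(B, N, C).permute(0, 2, 1).unsqueeze(-1)
assert torch.allclose(dense, dense_again)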

How to train, test and evaluate our models

Please see the README.md of each task inside the examples folder. All information about the code, data, and pretrained models can be found there.

Citation

Please cite our papers if you find anything helpful:

@InProceedings{li2019deepgcns,
    title={DeepGCNs: Can GCNs Go as Deep as CNNs?},
    author={Guohao Li and Matthias Müller and Ali Thabet and Bernard Ghanem},
    booktitle={The IEEE International Conference on Computer Vision (ICCV)},
    year={2019}
}
@misc{li2019deepgcns_journal,
    title={DeepGCNs: Making GCNs Go as Deep as CNNs},
    author={Guohao Li and Matthias Müller and Guocheng Qian and Itzel C. Delgadillo and Abdulellah Abualshour and Ali Thabet and Bernard Ghanem},
    year={2019},
    eprint={1910.06849},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}
@misc{li2020deepergcn,
    title={DeeperGCN: All You Need to Train Deeper GCNs},
    author={Guohao Li and Chenxin Xiong and Ali Thabet and Bernard Ghanem},
    year={2020},
    eprint={2006.07739},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

License

MIT License

Contact

For more information, please contact Guohao Li, Matthias Müller, or Guocheng Qian.
