Documentation | Paper | Colab Notebooks | External Resources | OGB Examples
PyTorch Geometric (PyG) is a geometric deep learning extension library for PyTorch.
It consists of various methods for deep learning on graphs and other irregular structures, also known as geometric deep learning, from a variety of published papers.
In addition, it consists of an easy-to-use mini-batch loader for many small and single giant graphs, multi-GPU support, a large number of common benchmark datasets (based on simple interfaces to create your own), and helpful transforms, both for learning on arbitrary graphs as well as on 3D meshes or point clouds.
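For readers new to the library: a graph is typically described by a node feature matrix of shape [N, F] together with an edge index of shape [2, E] in COO format (source row, target row). A minimal sketch of that layout, using plain Python lists in place of tensors for illustration:

```python
# A 3-node triangle graph in the [2, E] COO layout PyG uses.
# Each undirected edge appears twice (once per direction).
edge_index = [
    [0, 1, 1, 2, 2, 0],  # source nodes
    [1, 0, 2, 1, 0, 2],  # target nodes
]
x = [[1.0], [2.0], [3.0]]  # node feature matrix, shape [N, F] with N=3, F=1

num_nodes = len(x)
num_edges = len(edge_index[0])
print(num_nodes, num_edges)  # -> 3 6
```

In actual PyG code these would be `torch.Tensor`s wrapped in a `torch_geometric.data.Data` object.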
Click here to join our Slack community!
OGB is hosting a large-scale graph machine learning challenge (OGB-LSC) at KDD Cup 2021 from March 15th to June 8th in order to discover innovative solutions for large-scale node classification, link prediction, and graph regression.
We are looking forward to your participation!
PyTorch Geometric makes implementing Graph Neural Networks a breeze (see here for the accompanying tutorial).
For example, this is all it takes to implement the edge convolutional layer:
import torch
from torch.nn import Sequential as Seq, Linear as Lin, ReLU
from torch_geometric.nn import MessagePassing
class EdgeConv(MessagePassing):
    def __init__(self, F_in, F_out):
        super(EdgeConv, self).__init__(aggr='max')  # "Max" aggregation.
        self.mlp = Seq(Lin(2 * F_in, F_out), ReLU(), Lin(F_out, F_out))

    def forward(self, x, edge_index):
        # x has shape [N, F_in]
        # edge_index has shape [2, E]
        return self.propagate(edge_index, x=x)  # shape [N, F_out]

    def message(self, x_i, x_j):
        # x_i has shape [E, F_in]
        # x_j has shape [E, F_in]
        edge_features = torch.cat([x_i, x_j - x_i], dim=1)  # shape [E, 2 * F_in]
        return self.mlp(edge_features)  # shape [E, F_out]
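To make the message-passing semantics concrete, here is a plain-Python sketch (no PyG required) of what `propagate` does for the layer above: for every edge (j → i), a message is computed from `[x_i, x_j - x_i]`, and the incoming messages of each target node are reduced with the "max" aggregation. The identity "MLP" in the usage example is purely illustrative.

```python
def edge_conv_sketch(x, edge_index, mlp):
    # x: list of node feature vectors; edge_index: [sources, targets].
    src, dst = edge_index
    out = [None] * len(x)  # nodes without incoming edges stay None
    for j, i in zip(src, dst):
        x_i, x_j = x[i], x[j]
        # Message on edge (j -> i): mlp([x_i, x_j - x_i]).
        msg = mlp([*x_i, *(a - b for a, b in zip(x_j, x_i))])
        # "max" aggregation, elementwise over incoming messages.
        out[i] = msg if out[i] is None else [max(a, b) for a, b in zip(out[i], msg)]
    return out

# Tiny usage example with an identity "MLP": edges 1->0 and 2->0.
x = [[0.0], [1.0], [2.0]]
edge_index = [[1, 2], [0, 0]]
print(edge_conv_sketch(x, edge_index, lambda v: v))  # -> [[0.0, 2.0], None, None]
```

The real `MessagePassing.propagate` performs the same gather/message/aggregate steps, but vectorized over all edges at once.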
In detail, the following methods are currently implemented:

SplineConv from Fey et al.: SplineCNN: Fast Geometric Deep Learning with Continuous B-Spline Kernels (CVPR 2018) [Example1, Example2]

GCNConv from Kipf and Welling: Semi-Supervised Classification with Graph Convolutional Networks (ICLR 2017) [Example]

GCN2Conv from Chen et al.: Simple and Deep Graph Convolutional Networks (ICML 2020) [Example1, Example2]

ChebConv from Defferrard et al.: Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering (NIPS 2016) [Example]

NNConv from Gilmer et al.: Neural Message Passing for Quantum Chemistry (ICML 2017) [Example1, Example2]

CGConv from Xie and Grossman: Crystal Graph Convolutional Neural Networks for an Accurate and Interpretable Prediction of Material Properties (Physical Review Letters 120, 2018)

ECConv from Simonovsky and Komodakis: Edge-Conditioned Convolution on Graphs (CVPR 2017)

EGConv from Tailor et al.: Adaptive Filters and Aggregator Fusion for Efficient Graph Convolutions (GNNSys 2021) [Example]

GATConv from Veličković et al.: Graph Attention Networks (ICLR 2018) [Example]

TransformerConv from Shi et al.: Masked Label Prediction: Unified Message Passing Model for Semi-Supervised Classification (CoRR 2020)

SAGEConv from Hamilton et al.: Inductive Representation Learning on Large Graphs (NIPS 2017) [Example1, Example2, Example3]

GraphConv from, e.g., Morris et al.: Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks (AAAI 2019)

GatedGraphConv from Li et al.: Gated Graph Sequence Neural Networks (ICLR 2016)

ResGatedGraphConv from Bresson and Laurent: Residual Gated Graph ConvNets (CoRR 2017)

GINConv from Xu et al.: How Powerful are Graph Neural Networks? (ICLR 2019) [Example]

GINEConv from Hu et al.: Strategies for Pre-training Graph Neural Networks (ICLR 2020)

ARMAConv from Bianchi et al.: Graph Neural Networks with Convolutional ARMA Filters (CoRR 2019) [Example]

SGConv from Wu et al.: Simplifying Graph Convolutional Networks (CoRR 2019) [Example]

APPNP from Klicpera et al.: Predict then Propagate: Graph Neural Networks meet Personalized PageRank (ICLR 2019) [Example]

MFConv from Duvenaud et al.: Convolutional Networks on Graphs for Learning Molecular Fingerprints (NIPS 2015)

AGNNConv from Thekumparampil et al.: Attention-based Graph Neural Network for Semi-Supervised Learning (CoRR 2017) [Example]

TAGConv from Du et al.: Topology Adaptive Graph Convolutional Networks (CoRR 2017) [Example]

PNAConv from Corso et al.: Principal Neighbourhood Aggregation for Graph Nets (CoRR 2020) [Example]

FAConv from Bo et al.: Beyond Low-Frequency Information in Graph Convolutional Networks (AAAI 2021)

RGCNConv from Schlichtkrull et al.: Modeling Relational Data with Graph Convolutional Networks (ESWC 2018) [Example]

FiLMConv from Brockschmidt: GNN-FiLM: Graph Neural Networks with Feature-wise Linear Modulation (ICML 2020) [Example]

SignedConv from Derr et al.: Signed Graph Convolutional Network (ICDM 2018) [Example]

DNAConv from Fey: Just Jump: Dynamic Neighborhood Aggregation in Graph Neural Networks (ICLRW 2019) [Example]

PANConv from Ma et al.: Path Integral Based Convolution and Pooling for Graph Neural Networks (NeurIPS 2020)

PointConv (including Iterative Farthest Point Sampling, dynamic graph generation based on nearest neighbor or maximum distance, and kNN interpolation for upsampling) from Qi et al.: PointNet: Deep Learning on Point Sets for 3D Classification and Segmentation (CVPR 2017) and PointNet++: Deep Hierarchical Feature Learning on Point Sets in a Metric Space (NIPS 2017) [Example1, Example2]

EdgeConv from Wang et al.: Dynamic Graph CNN for Learning on Point Clouds (CoRR, 2018) [Example1, Example2]

XConv from Li et al.: PointCNN: Convolution On X-Transformed Points (NeurIPS 2018) [Example]

PPFConv from Deng et al.: PPFNet: Global Context Aware Local Features for Robust 3D Point Matching (CVPR 2018)

GMMConv from Monti et al.: Geometric Deep Learning on Graphs and Manifolds using Mixture Model CNNs (CVPR 2017)

FeaStConv from Verma et al.: FeaStNet: Feature-Steered Graph Convolutions for 3D Shape Analysis (CVPR 2018)

HypergraphConv from Bai et al.: Hypergraph Convolution and Hypergraph Attention (CoRR 2019)

GravNetConv from Qasim et al.: Learning Representations of Irregular Particle-detector Geometry with Distance-weighted Graph Networks (European Physics Journal C, 2019)

SuperGAT from Kim and Oh: How To Find Your Friendly Neighborhood: Graph Attention Design With Self-Supervision (ICLR 2021) [Example]
 A MetaLayer for building any kind of graph network similar to the TensorFlow Graph Nets library from Battaglia et al.: Relational Inductive Biases, Deep Learning, and Graph Networks (CoRR 2018)

GlobalAttention from Li et al.: Gated Graph Sequence Neural Networks (ICLR 2016) [Example]

Set2Set from Vinyals et al.: Order Matters: Sequence to Sequence for Sets (ICLR 2016) [Example]

Sort Pool from Zhang et al.: An End-to-End Deep Learning Architecture for Graph Classification (AAAI 2018) [Example]

Dense Differentiable Pooling from Ying et al.: Hierarchical Graph Representation Learning with Differentiable Pooling (NeurIPS 2018) [Example]

Dense MinCUT Pooling from Bianchi et al.: MinCUT Pooling in Graph Neural Networks (CoRR 2019) [Example]

Graclus Pooling from Dhillon et al.: Weighted Graph Cuts without Eigenvectors: A Multilevel Approach (PAMI 2007) [Example]

Voxel Grid Pooling from, e.g., Simonovsky and Komodakis: Dynamic Edge-Conditioned Filters in Convolutional Neural Networks on Graphs (CVPR 2017) [Example]

TopK Pooling from Gao and Ji: Graph U-Nets (ICML 2019), Cangea et al.: Towards Sparse Hierarchical Graph Classifiers (NeurIPS-W 2018) and Knyazev et al.: Understanding Attention and Generalization in Graph Neural Networks (ICLR-W 2019) [Example]

SAG Pooling from Lee et al.: Self-Attention Graph Pooling (ICML 2019) and Knyazev et al.: Understanding Attention and Generalization in Graph Neural Networks (ICLR-W 2019) [Example]

Edge Pooling from Diehl et al.: Towards Graph Pooling by Edge Contraction (ICMLW 2019) and Diehl: Edge Contraction Pooling for Graph Neural Networks (CoRR 2019) [Example]

ASAPooling from Ranjan et al.: ASAP: Adaptive Structure Aware Pooling for Learning Hierarchical Graph Representations (AAAI 2020) [Example]

PANPooling from Ma et al.: Path Integral Based Convolution and Pooling for Graph Neural Networks (NeurIPS 2020)

MemPooling from Khasahmadi et al.: Memory-Based Graph Networks (ICLR 2020)

Local Degree Profile from Cai and Wang: A Simple yet Effective Baseline for Non-attribute Graph Classification (CoRR 2018)

Jumping Knowledge from Xu et al.: Representation Learning on Graphs with Jumping Knowledge Networks (ICML 2018) [Example]

Node2Vec from Grover and Leskovec: node2vec: Scalable Feature Learning for Networks (KDD 2016) [Example]

MetaPath2Vec from Dong et al.: metapath2vec: Scalable Representation Learning for Heterogeneous Networks (KDD 2017) [Example]

Deep Graph Infomax from Veličković et al.: Deep Graph Infomax (ICLR 2019) [Example1, Example2]
All variants of Graph Autoencoders and Variational Autoencoders

SEAL from Zhang and Chen: Link Prediction Based on Graph Neural Networks (NeurIPS 2018)

RENet from Jin et al.: Recurrent Event Network for Reasoning over Temporal Knowledge Graphs (ICLRW 2019) [Example]

GraphUNet from Gao and Ji: Graph U-Nets (ICML 2019) [Example]

SchNet from Schütt et al.: SchNet: A Continuous-filter Convolutional Neural Network for Modeling Quantum Interactions (NIPS 2017) [Example]

DimeNet from Klicpera et al.: Directional Message Passing for Molecular Graphs (ICLR 2020) [Example]

AttentiveFP from Xiong et al.: Pushing the Boundaries of Molecular Representation for Drug Discovery with the Graph Attention Mechanism (J. Med. Chem. 2020) [Example]

DeepGCN and the GENConv from Li et al.: DeepGCNs: Can GCNs Go as Deep as CNNs? (ICCV 2019) and DeeperGCN: All You Need to Train Deeper GCNs (CoRR 2020) [Example]

NeighborSampler from Hamilton et al.: Inductive Representation Learning on Large Graphs (NIPS 2017) [Example1, Example2, Example3]

ClusterGCN from Chiang et al.: Cluster-GCN: An Efficient Algorithm for Training Deep and Large Graph Convolutional Networks (KDD 2019) [Example1, Example2]

GraphSAINT from Zeng et al.: GraphSAINT: Graph Sampling Based Inductive Learning Method (ICLR 2020) [Example]

ShaDow from Zeng et al.: Deep Graph Neural Networks with Shallow Subgraph Samplers (CoRR 2020)

GDC from Klicpera et al.: Diffusion Improves Graph Learning (NeurIPS 2019) [Example]

SIGN from Rossi et al.: SIGN: Scalable Inception Graph Neural Networks (CoRR 2020) [Example]

GNNExplainer from Ying et al.: GNNExplainer: Generating Explanations for Graph Neural Networks (NeurIPS 2019) [Example]

DropEdge from Rong et al.: DropEdge: Towards Deep Graph Convolutional Networks on Node Classification (ICLR 2020)

GraphNorm from Cai et al.: GraphNorm: A Principled Approach to Accelerating Graph Neural Network Training (CoRR 2020)

GraphSizeNorm from Dwivedi et al.: Benchmarking Graph Neural Networks (CoRR 2020)

PairNorm from Zhao and Akoglu: PairNorm: Tackling Oversmoothing in GNNs (ICLR 2020)

DiffGroupNorm from Zhou et al.: Towards Deeper Graph Neural Networks with Differentiable Group Normalization (NeurIPS 2020)

Tree Decomposition from Jin et al.: Junction Tree Variational Autoencoder for Molecular Graph Generation (ICML 2018)

TGN from Rossi et al.: Temporal Graph Networks for Deep Learning on Dynamic Graphs (GRL+ 2020) [Example]

Weisfeiler Lehman Algorithm from Weisfeiler and Lehman: A Reduction of a Graph to a Canonical Form and an Algebra Arising During this Reduction (Nauchno-Technicheskaya Informatsia 1968) [Example]

Label Propagation from Zhu and Ghahramani: Learning from Labeled and Unlabeled Data with Label Propagation (CMU-CALD 2002) [Example]

CorrectAndSmooth from Huang et al.: Combining Label Propagation And Simple Models Outperforms Graph Neural Networks (CoRR 2020) [Example]
Head over to our documentation to find out more about installation, data handling, creation of datasets and a full list of implemented methods, transforms, and datasets.
For a quick start, check out our examples in the examples/ directory.
If you notice anything unexpected, please open an issue and let us know.
If you have any questions or are missing a specific feature, feel free to discuss them with us.
We are motivated to constantly make PyTorch Geometric even better.
Installation
We provide pip wheels for all major OS/PyTorch/CUDA combinations, see here.
PyTorch 1.8.0/1.8.1
To install the binaries for PyTorch 1.8.0 and 1.8.1, simply run
pip install torch-scatter -f https://pytorch-geometric.com/whl/torch-1.8.0+${CUDA}.html
pip install torch-sparse -f https://pytorch-geometric.com/whl/torch-1.8.0+${CUDA}.html
pip install torch-cluster -f https://pytorch-geometric.com/whl/torch-1.8.0+${CUDA}.html
pip install torch-spline-conv -f https://pytorch-geometric.com/whl/torch-1.8.0+${CUDA}.html
pip install torch-geometric
where ${CUDA} should be replaced by either cpu, cu101, cu102, or cu111 depending on your PyTorch installation. Binaries are provided for Python version <= 3.8.
 
|         | cpu | cu101 | cu102 | cu111 |
|---------|-----|-------|-------|-------|
| Linux   | ✅  | ✅    | ✅    | ✅    |
| Windows | ✅  | ✅    | ✅    | ✅    |
| macOS   | ✅  |       |       |       |
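As a concrete illustration of the substitution, a CPU-only install for PyTorch 1.8.* would construct the following command (shown here for torch-scatter only; the other packages follow the same pattern):

```shell
# Hypothetical CPU-only example: ${CUDA} is replaced by "cpu".
CUDA=cpu
echo "pip install torch-scatter -f https://pytorch-geometric.com/whl/torch-1.8.0+${CUDA}.html"
```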
PyTorch 1.7.0/1.7.1
To install the binaries for PyTorch 1.7.0 and 1.7.1, simply run
pip install torch-scatter -f https://pytorch-geometric.com/whl/torch-1.7.0+${CUDA}.html
pip install torch-sparse -f https://pytorch-geometric.com/whl/torch-1.7.0+${CUDA}.html
pip install torch-cluster -f https://pytorch-geometric.com/whl/torch-1.7.0+${CUDA}.html
pip install torch-spline-conv -f https://pytorch-geometric.com/whl/torch-1.7.0+${CUDA}.html
pip install torch-geometric
where ${CUDA} should be replaced by either cpu, cu92, cu101, cu102, or cu110 depending on your PyTorch installation. Binaries are provided for Python version <= 3.8.
 
|         | cpu | cu92 | cu101 | cu102 | cu110 |
|---------|-----|------|-------|-------|-------|
| Linux   | ✅  | ✅   | ✅    | ✅    | ✅    |
| Windows | ✅  | ❌   | ✅    | ✅    | ✅    |
| macOS   | ✅  |      |       |       |       |
Note: Binaries of older versions are also provided for PyTorch 1.4.0, PyTorch 1.5.0 and PyTorch 1.6.0 (following the same procedure).
From master
In case you want to experiment with the latest PyG features which are not fully released yet, you can install PyG from master via
pip install git+https://github.com/rusty1s/pytorch_geometric.git
Running examples
cd examples
python gcn.py
Cite
Please cite our paper (and the respective papers of the methods used) if you use this code in your own work:
@inproceedings{Fey/Lenssen/2019,
  title={Fast Graph Representation Learning with {PyTorch Geometric}},
  author={Fey, Matthias and Lenssen, Jan E.},
  booktitle={ICLR Workshop on Representation Learning on Graphs and Manifolds},
  year={2019},
}
Feel free to email us if you wish your work to be listed in the external resources.
Running tests
python setup.py test