Graph Neural Networks

Brian Liu @CRAI 23/09/2021

What is a GNN

  • A GNN is a class of neural networks that operates directly on graph-structured data by passing messages between nodes.
  • Graphs have arbitrary size and complex topological structure; no spatial locality as in images/grids; no fixed node ordering (node permutation invariance); often dynamic.

Graph Definition

  • Mathematically, a graph is defined as a tuple of a set of nodes/vertices and a set of edges/links.
  • Each edge is a pair of two vertices and represents a connection between them.
\mathcal{V} = \{v_1, \ldots, v_n\}
v_i = \left(i, x_{i} \in \mathbb{R}^{d}\right)
\mathcal{E} = \{\varepsilon_1, \ldots, \varepsilon_n\}
\varepsilon_i = \{\varepsilon_{ij} = (i, j, \alpha_{ij})\}_{j \in \mathcal{N}(i)}
\mathcal{G} = (A, X)

Stacked node feature matrix

X = [x_1, \ldots, x_i, \ldots, x_n]^{\top} \in \mathbb{R}^{n \times d}

 Adjacency matrix

A = \left[\begin{array}{ccc}{\alpha_{11}} & {\cdots} & {\alpha_{1 n}} \\ {\vdots} & {\ddots} & {\vdots} \\ {\alpha_{n 1}} & {\cdots} & {\alpha_{n n}}\end{array}\right]

Graph Definition

A simple example graph containing 4 nodes

\mathcal{E}=\{(1,2), (2,3), (2,4), (3,4)\}

An undirected graph without self-loops

with its corresponding adjacency matrix (symmetric)

A directed graph with self-loops

with its corresponding adjacency matrix (asymmetric)
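For illustration, the symmetric adjacency matrix of the undirected 4-node example above can be built directly from its edge set (a minimal sketch, assuming unit edge weights and no self-loops):

```python
import numpy as np

# Edge set of the undirected example: E = {(1,2), (2,3), (2,4), (3,4)}
edges = [(1, 2), (2, 3), (2, 4), (3, 4)]
n = 4

A = np.zeros((n, n))
for i, j in edges:
    # 1-based node ids -> 0-based indices; an undirected edge
    # yields symmetric entries alpha_ij = alpha_ji = 1
    A[i - 1, j - 1] = 1.0
    A[j - 1, i - 1] = 1.0
```

For the directed variant, only the single entry A[i-1, j-1] would be set (plus diagonal entries for self-loops), making A asymmetric.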

GNN: message-passing

GNNs rely on message passing, meaning that vertices exchange information with their neighbors by sending "messages" to each other.

Message passing rules describe how node embeddings are learned. A generalized abstract GNN model can be defined as:

{X}^{(k)} = \operatorname{GNN}\left({X}^{(k-1)}, A\right) = \mathcal{U}\left({X}^{(k-1)}, \mathcal{M}\left({X}^{(k-1)}, A\right)\right)
h^{(k)}_i = \overbrace{\mathcal{M}^{(k)}_{j \in \mathcal{N}(i)} \left(\mathbf{x}_i^{(k-1)}, \mathbf{x}_j^{(k-1)},{\alpha}_{ij}\right)}^{\textbf{neighborhood aggregation function}}
\mathbf{x}_i^{(k)} = \overbrace{\mathcal{U}^{(k)} \left( \mathbf{x}_i^{(k-1)}, h^{(k)}_i \right)}^{\textbf{embedding update function}}


A simple visualization of the message-passing process in GNNs
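The abstract model above can be sketched in a few lines of numpy, assuming sum aggregation for \mathcal{M} and a single-layer update for \mathcal{U} (the names message_passing, W_agg, and W_upd are illustrative, not from any library):

```python
import numpy as np

def message_passing(X, A, W_agg, W_upd):
    """One round of generalized message passing (illustrative sketch).

    Aggregation M: each node sums the messages of its neighbors,
    weighted by the adjacency entries alpha_ij.
    Update U: combine the node's previous embedding with its
    aggregated message through a non-linearity.
    """
    H = A @ X @ W_agg               # h_i^(k): neighborhood aggregation
    X_new = np.tanh(X @ W_upd + H)  # x_i^(k): embedding update
    return X_new
```

Stacking k such rounds lets information from a node's k-hop neighborhood reach its embedding.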

Variants of GNNs

[1] Kipf, Thomas N., and Max Welling. "Semi-supervised Classification with Graph Convolutional Networks." (ICLR 2017)

[2] Hamilton, William L., et al. "Inductive Representation Learning on Large Graphs." (NeurIPS 2017)

[3] Xu, Keyulu, et al. "How Powerful are Graph Neural Networks?" (ICLR 2019)

[4] Veličković, Petar, et al. "Graph Attention Networks." (ICLR 2018)

[5] Ying, Rex, et al. "Hierarchical Graph Representation Learning with Differentiable Pooling." (NeurIPS 2018)

Based on aggregation and update functions

  • Spectral methods: GCN [1], ...
  • Non-spectral / Spatial methods: GraphSAGE [2], GIN [3], ...
  • Attention methods: GAT [4], ...

Based on tasks

  • Graph classification: DiffPool [5], ...
  • Node classification: GCN, GAT, GraphSAGE, ...


Graph Convolutional Networks (GCNs) were introduced by Kipf and Welling in 2016 at the University of Amsterdam.

X^{(k)}=\sigma \left(\hat{A} X^{(k-1)} W^{(k)}\right)


GCN implements the "message-passing" functions in the graph as a combination of linear transformations over one-hop neighbourhoods and non-linearities:

  • Aggregation function:
H^{(k)} \leftarrow \hat{A} X^{(k-1)} ,
\hat{A} = D^{-\frac{1}{2}}(A+I)D^{-\frac{1}{2}} ;
D_{ii} = \sum_{j}(A+I)_{ij}
  • Update function:
X^{(k)} \leftarrow \sigma\left(H^{(k)} W^{(k)}\right) .

Putting the two together:
X^{(k)} = \sigma\left(\hat{A} X^{(k-1)} W^{(k)}\right)
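A minimal numpy sketch of one GCN layer following the equations above (the function name gcn_layer and the choice of ReLU for σ are illustrative):

```python
import numpy as np

def gcn_layer(X, A, W):
    """One GCN layer: X^(k) = sigma(A_hat @ X^(k-1) @ W^(k))."""
    n = A.shape[0]
    A_tilde = A + np.eye(n)                    # add self-loops: A + I
    d = A_tilde.sum(axis=1)                    # degrees D_ii = sum_j (A+I)_ij
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    A_hat = D_inv_sqrt @ A_tilde @ D_inv_sqrt  # D^{-1/2} (A+I) D^{-1/2}
    return np.maximum(A_hat @ X @ W, 0.0)      # sigma = ReLU
```

The symmetric normalization keeps the propagated features on a comparable scale regardless of node degree.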

GCNs vs Self-attention

X^{(k)}=\sigma \left(\hat{A} X^{(k-1)} W^{(k)}\right)
\text{Attention}(Q, K, V)=\text{Softmax}\left(\frac{Q K^{T}}{\sqrt{d_k}}\right) V
\hat{A}=D^{-\frac{1}{2}}(A+I)D^{-\frac{1}{2}} ;
D_{ii} = \sum_{j}(A+I)_{ij}

Both rules compute a weighted average of feature vectors: in a GCN the weights \hat{A} are fixed by the graph structure, whereas self-attention learns its weight matrix \text{Softmax}(QK^{T}/\sqrt{d_k}) from the data.
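To make the contrast concrete, here is a numpy sketch of the self-attention side (function and weight names are illustrative); the learned softmax weights play the role that the fixed matrix \hat{A} plays in a GCN:

```python
import numpy as np

def self_attention(X, W_q, W_k, W_v):
    """Scaled dot-product self-attention over the rows of X."""
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # Row-wise softmax: a dense, *learned* analogue of A_hat
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V
```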

Open challenges

  • Depth: most current GNNs have no more than 3 layers due to over-smoothing issues.
  • Scale: computational issues in big-data settings, such as huge social networks (billions of links) or complex recommendation systems where each node can carry its own graph; Cluster-GCN addresses this.
  • Unstructured data: how to enable GNNs to take unstructured inputs like text/images/video; graph generation from raw data.

Applications to Medical field

  • Ahmedt-Aristizabal, David, et al. "Graph-Based Deep Learning for Medical Diagnosis and Analysis: Past, Present and Future." arXiv preprint arXiv:2105.13137 (2021).

Applications to CV

  • Making GNNs take unstructured inputs (text/images/video) for tasks like classification, segmentation, etc.

Applications to CV

  • Use GNNs for segmentation (node-classification)

Liu, Qinghui; Kampffmeyer, Michael; Jenssen, Robert; Salberg, Arnt Børre. "Self-constructing graph neural networks to model long-range pixel dependencies for semantic segmentation of remote sensing images." International Journal of Remote Sensing, vol 42.16, pp 6184-6208, doi:10.1080/01431161.2021.1936267, 2021.

Useful libraries

  • PyG (PyTorch Geometric)  - A library built upon PyTorch to easily write and train Graph Neural Networks (GNNs) for a wide range of applications related to structured data.
  • DGL (Deep Graph Library) - A high-performance and scalable Python package for deep learning on graphs. DGL is framework-agnostic, meaning that if a deep graph model is a component of an end-to-end application, the rest of the logic can be implemented in any major framework, such as PyTorch, MXNet, or TensorFlow.

Thanks for your attention!
