Graph Neural Networks
Brian Liu @CRAI 23/09/2021
What is a GNN?
- GNNs are a class of neural networks that operate directly on graph-structured data by passing messages between nodes.
- Graphs have arbitrary size and complex topological structure; no spatial locality as in images/grids; no fixed node ordering (node permutation invariance); often dynamic.
Graph Definition
- Mathematically, a graph is defined as a tuple of a set of nodes/vertices and a set of edges/links.
- Each edge is a pair of vertices and represents a connection between them.
- Node feature matrix: summarizes the features of all nodes.
- Adjacency matrix: encodes the connections between nodes.
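In standard notation (a sketch using symbols common in the GNN literature; these symbols are assumptions, not taken from the slide):

```latex
G = (V, E), \quad |V| = N
X \in \mathbb{R}^{N \times D} \quad \text{(node feature matrix: one $D$-dimensional feature row per node)}
A \in \{0,1\}^{N \times N}, \quad A_{ij} = 1 \iff (v_i, v_j) \in E \quad \text{(adjacency matrix)}
```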
Graph Definition
Simple example graphs containing 4 nodes:
- An undirected graph without self-loops, with its corresponding adjacency matrix (symmetric).
- A directed graph with self-loops, with its corresponding adjacency matrix (asymmetric).
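A minimal NumPy sketch of such adjacency matrices (the concrete edges are illustrative, not taken from the slide figure):

```python
import numpy as np

# Undirected 4-node graph without self-loops: edges (0-1), (1-2), (2-3), (3-0)
A_undirected = np.array([[0, 1, 0, 1],
                         [1, 0, 1, 0],
                         [0, 1, 0, 1],
                         [1, 0, 1, 0]])
assert (A_undirected == A_undirected.T).all()      # symmetric

# Directed 4-node graph with a self-loop on node 0: edges 0->0, 0->1, 1->2, 3->2
A_directed = np.array([[1, 1, 0, 0],
                       [0, 0, 1, 0],
                       [0, 0, 0, 0],
                       [0, 0, 1, 0]])
assert not (A_directed == A_directed.T).all()      # asymmetric
```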
GNN: message-passing
GNNs rely on message-passing methods, which means that vertices exchange information with their neighbors by sending "messages" to each other.
Message passing rules describe how node embeddings are learned. A generalized abstract GNN model can be defined as:
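A common way to write this generalized form (a sketch of the standard formulation; the slide's original equation is not reproduced here):

```latex
h_v^{(k)} = \mathrm{UPDATE}^{(k)}\!\Big( h_v^{(k-1)},\; \mathrm{AGGREGATE}^{(k)}\big(\{\, h_u^{(k-1)} : u \in \mathcal{N}(v) \,\}\big) \Big), \qquad h_v^{(0)} = x_v
```

where $\mathcal{N}(v)$ denotes the neighbors of node $v$ and $h_v^{(k)}$ is the embedding of $v$ after $k$ message-passing steps.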
GNN
A simple visualization of the message-passing process of GNNs
Variants of GNNs
[1] Kipf et al., "Semi-Supervised Classification with Graph Convolutional Networks", (ICLR-2017)
[2] Hamilton et al., "Inductive Representation Learning on Large Graphs", (NeurIPS-2017)
[3] Xu et al., "How Powerful are Graph Neural Networks?", (ICLR-2019)
[4] Veličković et al., "Graph Attention Networks", (ICLR-2018)
[5] Ying et al., "Hierarchical Graph Representation Learning with Differentiable Pooling", (NeurIPS-2018)
Based on aggregation and update functions
- Spectral methods: GCN [1], ...
- Non-spectral / Spatial methods: GraphSAGE [2], GIN [3], ...
- Attention methods: GAT [4], ...
Based on tasks
- Graph classification: DiffPool [5], ...
- Node classification: GCN, GAT, GraphSAGE, ...
GCNs
Graph Convolutional Networks (GCNs) were introduced by Kipf and Welling in 2016 at the University of Amsterdam.
GCNs
GCN implements the "message-passing" function on the graph as a combination of linear transformations over one-hop neighbourhoods and non-linearities, defined as:
- Aggregation function
- Update function
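As a sketch, the layer-wise propagation rule from Kipf and Welling's GCN paper [1] combines these two steps (notation assumed from the standard formulation):

```latex
H^{(l+1)} = \sigma\!\big( \tilde{D}^{-1/2} \tilde{A} \tilde{D}^{-1/2} H^{(l)} W^{(l)} \big), \qquad \tilde{A} = A + I_N, \quad \tilde{D}_{ii} = \textstyle\sum_j \tilde{A}_{ij}
```

Aggregation is the normalized one-hop neighbourhood sum $\tilde{D}^{-1/2}\tilde{A}\tilde{D}^{-1/2} H^{(l)}$; the update is the linear transformation $W^{(l)}$ followed by the non-linearity $\sigma$.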
GCNs vs Self-attention
Open challenges
- Depth: Most current GNNs have no more than 3 layers due to over-smoothing issues. Read more...
- Scale: Computational issues in big-data environments, such as huge social networks (billions of links) or complex recommendation systems (in which each node has its own graph). Read more about Cluster-GCN
- Dynamic graphs: How to deal with graphs whose structure changes over time. Read more about GraphSAGE
- Unstructured data: How to enable GNNs to take unstructured inputs such as text/images/video; graph generation from raw data.
Applications to Medical field
- Ahmedt-Aristizabal et al., "Graph-Based Deep Learning for Medical Diagnosis and Analysis: Past, Present and Future", arXiv:2105.13137 (2021)
Applications to CV
- Make GNNs take unstructured inputs (text/images/video) for tasks such as classification and segmentation.
Applications to CV
- Use GNNs for segmentation (node classification)
Liu, Qinghui; Kampffmeyer, Michael; Jenssen, Robert; Salberg, Arnt Børre. "Self-constructing graph neural networks to model long-range pixel dependencies for semantic segmentation of remote sensing images." International Journal of Remote Sensing, vol 42.16, pp 6184-6208, doi:10.1080/01431161.2021.1936267, 2021.
Useful libraries
- PyG (PyTorch Geometric) - a library built upon PyTorch to easily write and train Graph Neural Networks (GNNs) for a wide range of applications related to structured data (see the minimal sketch after this list).
- DGL (Deep Graph Library) - a high-performance and scalable Python package for deep learning on graphs. DGL is framework agnostic: if a deep graph model is a component of an end-to-end application, the rest of the logic can be implemented in any major framework, such as PyTorch, MXNet, or TensorFlow.
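A minimal PyG sketch of a two-layer GCN for node classification, using PyG's GCNConv and Data classes; the toy graph and dimensions are made up for illustration:

```python
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

# Toy undirected 4-node graph; undirected edges are stored as pairs in both directions
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3],
                           [1, 0, 2, 1, 3, 2]], dtype=torch.long)
x = torch.randn(4, 8)                          # 4 nodes, 8 input features each
data = Data(x=x, edge_index=edge_index)

class GCN(torch.nn.Module):
    def __init__(self, in_dim, hidden_dim, num_classes):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, num_classes)

    def forward(self, x, edge_index):
        x = F.relu(self.conv1(x, edge_index))  # aggregate + update, then non-linearity
        return self.conv2(x, edge_index)       # per-node class logits

model = GCN(in_dim=8, hidden_dim=16, num_classes=3)
out = model(data.x, data.edge_index)           # shape: [4, 3]
```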
Thanks for your attention!
Brian Liu @CRAI 23/9/2021