Elias W. BA
Writing hieroglyphs. Building pyramids.
@eliaswalyba
Machine Learning Teacher & Engineer
Graphs are all around us; real world objects are often defined in terms of their connections to other things.
A set of objects, and the connections between them, are naturally expressed as a graph.
Graph neural networks (GNNs) are a family of neural networks that can operate naturally on graph-structured data. By extracting and utilizing features from the underlying graph, GNNs can make more informed predictions about entities in these interactions, as compared to models that consider individual entities in isolation.
A GNN is an optimizable transformation on all attributes of the graph (nodes, edges, global-context) that preserves graph symmetries (permutation invariances).
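To make this definition concrete, here is a minimal sketch (NumPy only; the toy graph, shapes, and weight names are all made up for illustration) of the simplest such layer: an independent learned transform per attribute type, applied to node, edge, and global embeddings, with the connectivity left untouched.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, w, b):
    # A single linear map with ReLU; stands in for a small MLP.
    return np.maximum(x @ w + b, 0.0)

# Toy graph: 4 nodes with 3-d embeddings, 5 edges with 2-d embeddings,
# one 2-d global vector. Connectivity is a list of (src, dst) pairs.
nodes = rng.normal(size=(4, 3))
edges = rng.normal(size=(5, 2))
global_u = rng.normal(size=(1, 2))
connectivity = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]

# One GNN layer: a separate transform for each attribute type.
Wn, bn = rng.normal(size=(3, 3)), np.zeros(3)
We, be = rng.normal(size=(2, 2)), np.zeros(2)
Wu, bu = rng.normal(size=(2, 2)), np.zeros(2)

new_nodes = mlp(nodes, Wn, bn)     # node embeddings updated
new_edges = mlp(edges, We, be)     # edge embeddings updated
new_u = mlp(global_u, Wu, bu)      # global embedding updated
# connectivity is not touched: the structure stays fixed, only embeddings change.
```

Because each node (or edge) is transformed by the same function independently of its index, relabeling the nodes just permutes the output rows accordingly, which is exactly the permutation symmetry the definition asks for.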
The structure doesn't change, only the embeddings do.
GNNs are very similar to Encoder-Decoder models.
Example: the model is node-centered and we need to predict on nodes.
We may need node information in the edges, and vice versa.
To do so, we pool: collect information from nodes into edges, and vice versa.
Edge centered to Node prediction
Node centered to Edge prediction
Node & Edge centered to Global State prediction
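The three pooling directions above can be sketched in a few lines of NumPy (a toy illustration; the graph, dimensions, and variable names are invented for the example, and sum-pooling is just one common aggregation choice):

```python
import numpy as np

# Toy graph: node and edge embeddings share dimension d so they can be summed.
d = 4
nodes = np.arange(3 * d, dtype=float).reshape(3, d)   # 3 nodes
edges = np.ones((2, d))                                # 2 edges
connectivity = [(0, 1), (1, 2)]                        # (src, dst) pairs

# Edge-centered -> node prediction: pool each node's incident edge embeddings.
pooled_to_nodes = np.zeros_like(nodes)
for e, (s, t) in enumerate(connectivity):
    pooled_to_nodes[s] += edges[e]
    pooled_to_nodes[t] += edges[e]

# Node-centered -> edge prediction: pool each edge's endpoint node embeddings.
pooled_to_edges = np.stack([nodes[s] + nodes[t] for s, t in connectivity])

# Node & edge centered -> global state: pool everything into one vector.
pooled_to_global = nodes.sum(axis=0) + edges.sum(axis=0)
```

In each direction the pooled vectors are then fed to the relevant prediction head (node, edge, or global), so information can flow to wherever the task needs it.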
Graphs are beast modelers. Neural networks are witch predictors. Graph neural networks are the kids of their marriage.
😎
Let's implement a simple GNN using Keras
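One possible sketch, assuming TensorFlow 2 with its bundled Keras: a custom layer that does one round of message passing (neighbors' embeddings are sum-pooled into each node, then an MLP updates the node state). The names `GNNLayer` and `edge_index` and the toy data are illustrative, not a canonical implementation.

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

class GNNLayer(keras.layers.Layer):
    """One round of message passing: each node sum-pools its neighbors'
    embeddings, then a small MLP updates the node embedding."""

    def __init__(self, units):
        super().__init__()
        self.update_mlp = keras.layers.Dense(units, activation="relu")

    def call(self, node_states, edge_index):
        # edge_index: int tensor of shape (num_edges, 2) holding (src, dst).
        src, dst = edge_index[:, 0], edge_index[:, 1]
        messages = tf.gather(node_states, src)           # one message per edge
        num_nodes = tf.shape(node_states)[0]
        pooled = tf.math.unsorted_segment_sum(messages, dst, num_nodes)
        # Update: combine each node's own state with its pooled messages.
        return self.update_mlp(tf.concat([node_states, pooled], axis=-1))

# Toy graph: 4 nodes with 8-d features; undirected edges listed both ways.
node_features = np.random.randn(4, 8).astype("float32")
edge_index = np.array(
    [[0, 1], [1, 0], [1, 2], [2, 1], [2, 3], [3, 2]], dtype="int32"
)

layer1, layer2 = GNNLayer(16), GNNLayer(16)
readout = keras.layers.Dense(2)  # e.g. a 2-class node-classification head

h = layer1(node_features, edge_index)
h = layer2(h, edge_index)
logits = readout(h)  # shape (4, 2): one prediction per node
```

Stacking two layers lets information travel two hops across the graph; in a real training setup these layers would sit inside a `keras.Model` with a loss and optimizer attached.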
By Elias W. BA