In recent years, there has been a growing number of success stories in applying deep learning techniques to graph-structured data.
The main workhorse in this emerging field is the graph neural network: a message-passing algorithm parameterized by neural networks and trained via
backpropagation. Variants of graph neural networks now define the state of the art in many classical graph and network problems, such as node classification, graph classification, and link prediction.
In this talk, I will give an overview of structured deep models that employ graph neural networks as a key component and discuss trade-offs for a few popular model variants such as graph convolutional networks (GCNs) and graph attention networks (GATs).
I will further introduce three emerging research directions: learning deep generative models of graphs, inference of latent graph structure, and hierarchical
concept learning (“learning to pool”). Structured deep models are ideal candidates for these areas and hold great promise for applications such as program induction, chemical synthesis, causal inference, and
interacting physical and multi-agent systems.
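As a rough illustration of the message-passing idea behind GCNs, the sketch below implements a single graph-convolutional layer with the symmetric normalization H' = ReLU(D̂^{-1/2}(A + I)D̂^{-1/2} H W) from Kipf & Welling (2017). All names and the toy graph are illustrative, not part of the talk.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: normalize the adjacency, aggregate neighbours, transform.

    A: (n, n) adjacency matrix, H: (n, d) node features, W: (d, k) weights.
    """
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d = A_hat.sum(axis=1)                     # node degrees of A_hat
    D_inv_sqrt = np.diag(d ** -0.5)           # D_hat^{-1/2}
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt  # symmetrically normalized adjacency
    return np.maximum(A_norm @ H @ W, 0.0)    # ReLU activation

# Toy example: a 3-node path graph 0-1-2 with random features and weights.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
H = np.random.randn(3, 4)   # node features
W = np.random.randn(4, 2)   # layer weights (trained via backprop in practice)
H_next = gcn_layer(A, H, W)
print(H_next.shape)  # (3, 2): two output features per node
```

Each layer mixes a node's features with those of its immediate neighbours, so stacking k layers lets information propagate across k hops.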
- Kipf & Welling, Semi-supervised classification with graph convolutional networks, ICLR 2017
- Veličković et al., Graph attention networks, ICLR 2018
- Speaker: Thomas Kipf, University of Amsterdam
- Friday 25 May 2018, 14:00–15:00
- Venue: Lecture Theatre 2, Cambridge University Computer Laboratory, J J Thomson Avenue, Madingley Road, Cambridge.
- Series: CL-CompBio; organiser: Petar Veličković.