Graph Attention Networks: Self-Attention for GNNs

Graph Attention Networks (GATs) are one of the most popular types of Graph Neural Networks. Instead of computing static weights from node degrees, as Graph Convolutional Networks (GCNs) do, they assign dynamic weights to node features through a process called self-attention. The main idea behind GATs is that some neighbors are more important than others, regardless of their node degrees: node 4 can be more important than node 3, which in turn can be more important than node 2. In this article, we will […]

Read more
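
To make the idea of dynamic attention weights a bit more concrete, here is a minimal pure-PyTorch sketch of the attention coefficients from the original GAT formulation, computed for a single node's neighborhood. The feature sizes and the four-node neighborhood are invented purely for illustration.

```python
import torch
import torch.nn.functional as F

# Minimal sketch of GAT-style self-attention for one node's neighborhood.
# Dimensions and the tiny neighborhood below are illustrative only.
torch.manual_seed(0)

in_dim, out_dim = 8, 4
W = torch.randn(in_dim, out_dim)      # shared weight matrix applied to every node
a = torch.randn(2 * out_dim)          # learnable attention vector

h = torch.randn(4, in_dim)            # features of node 1 and its neighbors 2, 3, 4
Wh = h @ W                            # transformed features, shape (4, out_dim)

# Unnormalized scores e_1j = LeakyReLU(a^T [Wh_1 || Wh_j])
target = Wh[0].expand_as(Wh)          # repeat node 1's transformed features
e = F.leaky_relu(torch.cat([target, Wh], dim=1) @ a, negative_slope=0.2)

# Softmax turns the scores into dynamic weights that sum to 1 over the neighborhood
alpha = torch.softmax(e, dim=0)
h1_new = (alpha.unsqueeze(1) * Wh).sum(dim=0)   # new embedding for node 1
print(alpha)                          # weights depend on features, not on degrees
```

Because the softmax is taken over scores computed from the node features themselves, the resulting weights vary from node to node, unlike the fixed degree-based coefficients of a GCN.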

GraphSAGE: Scaling up Graph Neural Networks

What do UberEats and Pinterest have in common? They both use GraphSAGE to power their recommender systems at massive scale, on graphs with millions to billions of nodes and edges. 🖼️ Pinterest developed its own version, called PinSAGE, to recommend the most relevant images (pins) to its users. Its graph has 18 billion connections and 3 billion nodes. 🍽️ UberEats also reported using a modified version of GraphSAGE to suggest dishes, restaurants, and cuisines. UberEats claims to support more than 600,000 restaurants […]

Read more
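
For a rough idea of what "scaling up" means here, the sketch below shows the mechanism at the heart of GraphSAGE: sample a fixed number of neighbors per node and aggregate their features, here with a simple mean aggregator in plain PyTorch. The toy graph, feature sizes, and sample size are invented for illustration.

```python
import random
import torch

# Minimal sketch of GraphSAGE-style neighbor sampling + mean aggregation.
# The tiny graph, dimensions, and sample size are illustrative only.
torch.manual_seed(0)
random.seed(0)

num_nodes, in_dim, out_dim, sample_size = 6, 8, 4, 2
x = torch.randn(num_nodes, in_dim)                 # node features
neighbors = {0: [1, 2, 3, 4], 1: [0, 5], 2: [0],
             3: [0], 4: [0, 5], 5: [1, 4]}

W_self = torch.randn(in_dim, out_dim)
W_neigh = torch.randn(in_dim, out_dim)

def sage_embedding(node: int) -> torch.Tensor:
    # Sampling a fixed number of neighbors keeps the cost per node constant,
    # which is what makes the approach practical on very large graphs.
    sampled = random.sample(neighbors[node],
                            min(sample_size, len(neighbors[node])))
    neigh_mean = x[sampled].mean(dim=0)
    return torch.relu(x[node] @ W_self + neigh_mean @ W_neigh)

print(sage_embedding(0))
```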

GIN: How to Design the Most Powerful Graph Neural Network

Graph Neural Networks are not limited to classifying nodes. One of the most popular applications is graph classification. This is a common task when dealing with molecules: they are represented as graphs, and the features of each atom (node) can be used to predict the behavior of the entire molecule. However, GNNs only learn node embeddings. How can we combine them to produce an embedding for the entire graph? In this article, we will: See a new type of layer, called “global […]

Read more
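
As a preview of the “global” layer mentioned in the excerpt (global pooling, also called readout), here is a minimal PyTorch sketch of how node embeddings can be collapsed into a single graph embedding. The number of nodes and the embedding size are invented for illustration.

```python
import torch

# Minimal sketch of global (readout) pooling: collapsing node embeddings
# into one graph-level embedding. Shapes are illustrative only.
torch.manual_seed(0)

node_emb = torch.randn(5, 16)                 # embeddings of the 5 nodes of one graph

graph_emb_sum = node_emb.sum(dim=0)           # sum pooling
graph_emb_mean = node_emb.mean(dim=0)         # mean pooling
graph_emb_max = node_emb.max(dim=0).values    # max pooling

# The pooled vector can then be fed to a classifier to predict a label
# for the entire molecule/graph.
print(graph_emb_sum.shape)                    # torch.Size([16])
```

The choice of aggregator matters: sum pooling preserves more information about the multiset of node embeddings than mean or max, which is part of the argument behind GIN's design.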