GNN Tony Seale

Transformers and GNNs

Transformers analyse sentences by assigning importance to each word in relation to the others, which helps them predict or generate the next word. This 'attention mechanism' evaluates pairwise interactions between all tokens in a sequence, and those interactions can be seen as edges in a complete graph. Thus, a Transformer can be thought of as a graph-based model in which tokens are nodes and attention weights are edge weights.
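The correspondence above can be made concrete with a few lines of NumPy. This is a minimal sketch, not a full Transformer: it computes single-head scaled dot-product attention for a toy sequence of four tokens with randomly initialised embeddings and projection matrices (all hypothetical values), and shows that the resulting attention matrix is exactly the weighted adjacency matrix of a complete directed graph over the tokens.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy input: 4 tokens embedded in 8 dimensions.
tokens = ["graphs", "and", "transformers", "converge"]
n, d = len(tokens), 8
X = rng.normal(size=(n, d))

# Random query/key projections stand in for learned weights.
Wq = rng.normal(size=(d, d))
Wk = rng.normal(size=(d, d))
Q, K = X @ Wq, X @ Wk

# Pairwise interaction scores between every pair of tokens.
scores = Q @ K.T / np.sqrt(d)

# Row-wise softmax turns scores into attention weights.
A = np.exp(scores - scores.max(axis=1, keepdims=True))
A = A / A.sum(axis=1, keepdims=True)

# A is an n x n dense matrix: the weighted adjacency matrix of a
# complete directed graph whose nodes are the tokens. Every entry
# is positive (every token attends to every other token), and each
# node's outgoing edge weights sum to 1.
print(A.round(2))
```

Because every entry of `A` is non-zero, the graph is complete; a GNN, by contrast, typically restricts message passing to a sparse, pre-defined edge set, which is the key structural difference between the two model families.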
