
A Bridge between Graph Neural Networks and Transformers: Positional Encodings as Node Embeddings

Graph Neural Networks (GNNs) and Transformers are powerful frameworks for machine learning tasks. Although they evolved separately in different fields, recent research has revealed similarities and links between them. This work bridges the gap between GNNs and Transformers by offering a unified framework that highlights their similarities and distinctions. We study positional encodings and identify the key properties that make them effective node embeddings, finding that expressiveness, efficiency, and interpretability are achieved in the process. We show that positional encodings can be used as node embeddings for machine learning tasks such as node classification, graph classification, and link prediction. Finally, we discuss open challenges and outline directions for future work.
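One common way to realize the idea of positional encodings as node embeddings is to use the eigenvectors of the graph Laplacian as per-node feature vectors. The sketch below is a minimal illustration of that approach, assuming Laplacian eigenvector positional encodings (the function name and the choice of the symmetric normalized Laplacian are illustrative, not taken from the thesis):

```python
import numpy as np

def laplacian_positional_encoding(adj, k):
    """Return the k smallest non-trivial eigenvectors of the symmetric
    normalized Laplacian L = I - D^{-1/2} A D^{-1/2} as node embeddings."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
    lap = np.eye(adj.shape[0]) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    # eigh returns eigenvalues in ascending order with orthonormal eigenvectors
    eigvals, eigvecs = np.linalg.eigh(lap)
    # Skip the trivial eigenvector (eigenvalue 0); keep the next k columns,
    # giving each node a k-dimensional positional embedding.
    return eigvecs[:, 1:k + 1]

# Example: a 4-node path graph
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
pe = laplacian_positional_encoding(adj, k=2)
print(pe.shape)  # one 2-dimensional embedding per node
```

These embeddings can then be concatenated with (or substituted for) raw node features before being fed to a GNN layer or a Transformer's attention mechanism, which is the sense in which positional encodings act as node embeddings.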

Identifier: oai:union.ndltd.org:ETSU/oai:dc.etsu.edu:etd-5845
Date: 01 December 2023
Creators: Manu, Bright Kwaku
Publisher: Digital Commons @ East Tennessee State University
Source Sets: East Tennessee State University
Language: English
Detected Language: English
Type: text
Format: application/pdf
Source: Electronic Theses and Dissertations
Rights: Copyright by the authors.