On Oversquashing in Graph Neural Networks Through the Lens of Dynamical Systems

Alessio Gravina, Moshe Eliasof, Claudio Gallicchio, Davide Bacciu, Carola Bibiane Schönlieb

Research output: Contribution to journal › Conference article › peer-review

Abstract

A common problem in Message-Passing Neural Networks is oversquashing – the limited ability to facilitate effective information flow between distant nodes. Oversquashing is attributed to the exponential decay in information transmission as node distances increase. This paper introduces a novel perspective to address oversquashing, leveraging dynamical systems properties of global and local non-dissipativity, which enable the maintenance of a constant information flow rate. We present SWAN, a uniquely parameterized GNN model with antisymmetry in both the space and weight domains, as a means to obtain non-dissipativity. Our theoretical analysis asserts that, by implementing these properties, SWAN offers an enhanced ability to transmit information over extended distances. Empirical evaluations on synthetic and real-world benchmarks that emphasize long-range interactions validate the theoretical understanding of SWAN and its ability to mitigate oversquashing.
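The abstract's key mechanism – antisymmetry in the weight domain – can be illustrated with a minimal sketch. An antisymmetric matrix W_a = W − Wᵀ has purely imaginary eigenvalues, so the node-state ODE it induces is non-dissipative: signals neither explode nor decay across layers, which is the property the paper links to mitigating oversquashing. The function below is an illustrative forward-Euler step under this parameterization, not the authors' SWAN implementation; the names `antisymmetric_step` and `agg` (a placeholder for the neighborhood aggregation produced by message passing) are assumptions.

```python
import numpy as np

def antisymmetric_step(x, W, agg, bias, eps=0.1):
    """One forward-Euler step of a non-dissipative node update
    (a hypothetical sketch, not the authors' SWAN code).

    x    : (n, d) node states
    W    : (d, d) learnable weights
    agg  : (n, d) aggregated neighbor messages (from message passing)
    bias : (d,)   bias term
    eps  : step size of the Euler discretization
    """
    W_a = W - W.T  # antisymmetric: W_a.T == -W_a, eigenvalues purely imaginary
    return x + eps * np.tanh(x @ W_a.T + agg + bias)
```

Because the eigenvalues of W − Wᵀ have zero real part, the underlying continuous dynamics preserve the magnitude of information over depth, in contrast to generic weights whose negative-real-part eigenvalues cause the exponential decay the abstract attributes to oversquashing.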

Original language: English
Pages (from-to): 16906-16914
Number of pages: 9
Journal: Proceedings of the AAAI Conference on Artificial Intelligence
Volume: 39
Issue number: 16
DOIs
State: Published - 11 Apr 2025
Externally published: Yes
Event: 39th Annual AAAI Conference on Artificial Intelligence, AAAI 2025 - Philadelphia, United States
Duration: 25 Feb 2025 - 4 Mar 2025

ASJC Scopus subject areas

  • Artificial Intelligence
