Abstract
The integration of graph neural networks (GNNs) with neural ordinary and partial differential equations has been extensively studied in recent years. GNN architectures powered by neural differential equations allow us to reason about their behavior and to develop GNNs with desired properties such as controlled smoothing or energy conservation. In this paper we take inspiration from Turing instabilities in a reaction-diffusion (RD) system of partial differential equations and propose a novel family of GNNs based on neural RD systems, called RDGNN. We show that our RDGNN is effective for modeling various data types, from homophilic to heterophilic and spatiotemporal datasets. We discuss the theoretical properties of RDGNN and its implementation, and show that it improves on or is competitive with state-of-the-art methods.
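The abstract only summarizes the approach, so the sketch below illustrates what a reaction-diffusion style GNN update can look like in practice; it is a minimal, hypothetical PyTorch example, assuming an explicit Euler discretization of dX/dt = -L X K + R(X) with a graph Laplacian L for diffusion, a learned channel-mixing matrix K, and a learned pointwise reaction term R. The class name, step size, and reaction MLP are illustrative assumptions, not the paper's actual RDGNN formulation.

```python
import torch
import torch.nn as nn


class ReactionDiffusionLayer(nn.Module):
    """Hypothetical reaction-diffusion style GNN layer (not the paper's exact RDGNN)."""

    def __init__(self, channels: int, step_size: float = 0.1):
        super().__init__()
        self.step_size = step_size
        # Learned diffusion coefficients (channel mixing applied to the diffused signal).
        self.diffusivity = nn.Linear(channels, channels, bias=False)
        # Learned pointwise reaction term acting on each node independently.
        self.reaction = nn.Sequential(
            nn.Linear(channels, channels), nn.Tanh(), nn.Linear(channels, channels)
        )

    def forward(self, x: torch.Tensor, laplacian: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, channels); laplacian: (num_nodes, num_nodes) graph Laplacian.
        diffusion = -laplacian @ self.diffusivity(x)   # smoothing along graph edges
        reaction = self.reaction(x)                    # local nonlinear dynamics
        return x + self.step_size * (diffusion + reaction)  # explicit Euler step


# Usage on a toy 3-node path graph.
adj = torch.tensor([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
lap = torch.diag(adj.sum(dim=1)) - adj  # combinatorial Laplacian D - A
layer = ReactionDiffusionLayer(channels=4)
x = torch.randn(3, 4)
print(layer(x, lap).shape)  # torch.Size([3, 4])
```

The interplay of a smoothing diffusion term and a nonlinear reaction term is what allows Turing-type instabilities, i.e., stable non-uniform patterns, to emerge; the paper exploits this to counteract oversmoothing on heterophilic data, whereas the exact parameterization and discretization used in RDGNN are given in the paper itself.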
| Original language | English |
| --- | --- |
| Pages (from-to) | C399-C420 |
| Journal | SIAM Journal on Scientific Computing |
| Volume | 46 |
| Issue number | 4 |
| DOIs | |
| State | Published - 1 Jan 2024 |
Keywords
- Turing patterns
- graph neural networks
- reaction diffusion
ASJC Scopus subject areas
- Computational Mathematics
- Applied Mathematics