Abstract
Message-Passing Neural Networks (MPNNs) have become a cornerstone for processing and analyzing graph-structured data. However, their effectiveness is often hindered by phenomena such as over-squashing, where long-range dependencies or interactions are inadequately captured and expressed in the MPNN output. This limitation mirrors the challenges of the Effective Receptive Field (ERF) in Convolutional Neural Networks (CNNs), where the theoretical receptive field is underutilized in practice. In this work, we show and theoretically explain the limited ERF problem in MPNNs. Furthermore, inspired by recent advances in ERF augmentation for CNNs, we propose Interleaved Multiscale Message-Passing Neural Networks (IM-MPNN), an architecture designed to address these problems in MPNNs. Our method incorporates a hierarchical coarsening of the graph, enabling message passing across multiscale representations and facilitating long-range interactions without excessive depth or parameterization. Through extensive evaluations on benchmarks such as the Long-Range Graph Benchmark (LRGB), we demonstrate substantial improvements over baseline MPNNs in capturing long-range dependencies while maintaining computational efficiency.
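The abstract describes the architecture only at a high level. As a hedged illustration of the general idea, the PyTorch sketch below interleaves message passing on the original graph with message passing on one coarsened level. The coarsening via a fixed assignment matrix `assign`, the dense mean-aggregation layer, and the residual fusion are illustrative assumptions for a minimal sketch, not the paper's exact method.

```python
# Minimal sketch of interleaved multiscale message passing.
# Assumptions (not from the abstract): dense adjacency matrices, a
# precomputed soft assignment matrix `assign` mapping fine nodes to
# coarse nodes, and mean-aggregation message passing.
import torch
import torch.nn as nn


class DenseMPNNLayer(nn.Module):
    """One round of mean-aggregation message passing on a dense adjacency."""

    def __init__(self, dim: int):
        super().__init__()
        self.update = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU())

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        msg = adj @ x / deg  # mean over neighbours
        return self.update(torch.cat([x, msg], dim=-1))


class InterleavedMultiscaleBlock(nn.Module):
    """Interleaves message passing on a fine graph and one coarse level.

    `assign` is a fixed (n_fine x n_coarse) assignment matrix, e.g. from
    a precomputed clustering -- an assumption here, since the abstract
    only states that the graph is hierarchically coarsened.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.fine = DenseMPNNLayer(dim)
        self.coarse = DenseMPNNLayer(dim)

    def forward(self, x, adj, assign):
        x = self.fine(x, adj)            # local messages on the fine graph
        xc = assign.T @ x                # pool features to coarse nodes
        adj_c = assign.T @ adj @ assign  # coarsened adjacency
        xc = self.coarse(xc, adj_c)      # long-range messages, cheaply
        return x + assign @ xc           # lift back to fine nodes and fuse


# Usage with hypothetical shapes: x is (n, d), adj is (n, n),
# assign is (n, m) with m << n coarse nodes.
```

The point of the interleaving is that one hop on the coarse graph spans many hops on the fine graph, which is how a multiscale hierarchy can widen the effective receptive field without stacking many message-passing layers.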
| Field | Value |
|---|---|
| Original language | English |
| Pages (from-to) | 17203-17220 |
| Number of pages | 18 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 267 |
| State | Published - 1 Jan 2025 |
| Event | 42nd International Conference on Machine Learning (ICML 2025), Vancouver, Canada, 13 Jul 2025 to 19 Jul 2025 |
ASJC Scopus subject areas
- Software
- Control and Systems Engineering
- Statistics and Probability
- Artificial Intelligence