arxiv:2302.02941

On Over-Squashing in Message Passing Neural Networks: The Impact of Width, Depth, and Topology

Published on Feb 6, 2023
Abstract

Message Passing Neural Networks (MPNNs) are instances of Graph Neural Networks that leverage the graph to send messages over the edges. This inductive bias leads to a phenomenon known as over-squashing, where a node feature is insensitive to information contained at distant nodes. Despite recent methods introduced to mitigate this issue, an understanding of the causes of over-squashing and of possible solutions is lacking. In this theoretical work, we prove that: (i) Neural network width can mitigate over-squashing, but at the cost of making the whole network more sensitive; (ii) Conversely, depth cannot help mitigate over-squashing: increasing the number of layers leads to over-squashing being dominated by vanishing gradients; (iii) The graph topology plays the greatest role, since over-squashing occurs between nodes at high commute (access) time. Our analysis provides a unified framework to study different recent methods introduced to cope with over-squashing and serves as a justification for a class of methods that fall under graph rewiring.
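To illustrate point (iii) of the abstract, the following is a minimal sketch (not from the paper, using numpy and networkx as assumed dependencies) that computes commute times via the graph Laplacian pseudoinverse on a small barbell graph, a classic bottleneck topology. Pairs of nodes on opposite sides of the bottleneck have high commute time, which is where over-squashing is expected to be most severe.

```python
import numpy as np
import networkx as nx

# Barbell graph: two 5-node cliques joined by a 3-node path -- a
# bottleneck topology where over-squashing is expected to be severe.
G = nx.barbell_graph(5, 3)
n = G.number_of_nodes()
m = G.number_of_edges()

# Commute time C(u, v) = 2|E| * R_eff(u, v), where the effective
# resistance R_eff comes from the pseudoinverse of the graph Laplacian.
L = nx.laplacian_matrix(G).toarray().astype(float)
L_pinv = np.linalg.pinv(L)

def commute_time(u: int, v: int) -> float:
    e = np.zeros(n)
    e[u], e[v] = 1.0, -1.0
    r_eff = e @ L_pinv @ e  # effective resistance between u and v
    return 2 * m * r_eff

# Nodes within the same clique vs. nodes in opposite cliques:
print("same clique:      ", commute_time(0, 1))
print("across bottleneck:", commute_time(0, n - 1))
```

Running this shows the cross-bottleneck commute time is far larger than the within-clique one, matching the intuition that information exchanged between such node pairs is the most heavily squashed.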
