Over-Smoothing in Graph Neural Networks

Over-smoothing is the problem that the node representations in a graph neural network become too similar to each other as the network gets deeper.[^1] In Chapter 7 of Hamilton (2020), the author interprets this phenomenon through low-pass filtering from signal processing: multiplying a signal by $\mathbf A^n$ acts like a low-pass filter when $n$ is large, where $\mathbf A$ is the adjacency matrix.

[^1]: Hamilton WL. Graph Representation Learning. Morgan & Claypool Publishers; 2020. pp. 1–159. doi:10.2200/S01045ED1V01Y202009AIM046
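The low-pass effect is easy to see numerically. Below is a minimal NumPy sketch, not from the note itself: the toy path graph, the GCN-style normalization, and the iteration count are all illustrative choices. Repeatedly multiplying random node features by the normalized adjacency damps every eigencomponent except the dominant one, so the features collapse onto a single direction and the spread across nodes shrinks.

```python
import numpy as np

# Toy path graph on 4 nodes (an illustrative choice, not from the note).
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

# GCN-style symmetric normalization with self-loops:
# A_hat = D^{-1/2} (A + I) D^{-1/2}, whose eigenvalues lie in (-1, 1].
A_tilde = A + np.eye(4)
d_inv_sqrt = 1.0 / np.sqrt(A_tilde.sum(axis=1))
A_hat = d_inv_sqrt[:, None] * A_tilde * d_inv_sqrt[None, :]

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 1))  # one random scalar feature per node

# Repeated multiplication by A_hat acts as a low-pass filter: the node
# features converge to a multiple of the dominant eigenvector, which is
# proportional to the square roots of the (self-loop-augmented) degrees.
spread_before = np.ptp(x)
for _ in range(200):
    x = A_hat @ x
spread_after = np.ptp(x)

print(f"spread before: {spread_before:.3f}, after: {spread_after:.3f}")
```

Using the normalized adjacency instead of the raw $\mathbf A$ keeps the iteration numerically bounded; with the unnormalized matrix the features would also grow or shrink in overall scale, obscuring the smoothing effect.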



Lei Ma (2022). 'Over-Smoothing in Graph Neural Networks', Datumorphism, 08 April. Available at: https://datumorphism.leima.is/cards/graph/graph-neural-networks-over-smoothing/.