# Mix-hop Propagation in GNN

The mix-hop propagation layer has two steps^{1}:

- information propagation step:
$$ \mathbf H^{(k)} = \beta \mathbf H_{in} + (1-\beta)\mathbf L \mathbf H^{(k-1)}, $$

  where $\mathbf L = \tilde{\mathbf D}^{-1} (\mathbf A + \mathbf I)$ is the row-normalized adjacency matrix with self-loops, with $\tilde{\mathbf D}_{ii} = 1 + \sum_j \mathbf A_{ij}$. The term $\beta \mathbf H_{in}$ retains a fraction of each node's original state, so that repeated propagation does not over-smooth the node representations.
- information selection step:
$$ \mathbf H_{out} = \sum_{k=0}^{K} \mathbf H^{(k)} \mathbf W^{(k)}, $$

  where the learnable weights $\mathbf W^{(k)}$ select the important information from each hop.
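The two steps above can be sketched in NumPy. This is a minimal illustration, not the paper's implementation: the number of hops $K$, the weights $\mathbf W^{(k)}$, and $\beta$ are placeholder choices, and the adjacency matrix is assumed given rather than learned.

```python
import numpy as np

def mixhop_propagation(H_in, A, W_list, beta=0.05):
    """Mix-hop propagation layer sketch.

    H_in:   (n, d) input node states
    A:      (n, n) adjacency matrix (assumed given here)
    W_list: list of (d, d_out) hop weights W^(0), ..., W^(K)
    beta:   ratio of the input state retained at every hop
    """
    n = A.shape[0]
    A_tilde = A + np.eye(n)                       # add self-loops: A + I
    D_inv = np.diag(1.0 / A_tilde.sum(axis=1))    # D^{-1}, with D_ii = 1 + sum_j A_ij
    L = D_inv @ A_tilde                           # row-normalized adjacency

    H = H_in                                      # H^(0) = H_in
    H_out = H @ W_list[0]                         # k = 0 term of the selection sum
    for W in W_list[1:]:
        # information propagation step
        H = beta * H_in + (1 - beta) * (L @ H)
        # information selection step: accumulate H^(k) W^(k)
        H_out = H_out + H @ W
    return H_out
```

With `beta=1.0` every hop keeps the input state unchanged, so the output reduces to $\mathbf H_{in} \sum_k \mathbf W^{(k)}$, which makes the role of $\beta$ easy to check numerically.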

See Fig 4 in the paper^{1}.

Planted:
by L Ma;

References:
