@@ -16,19 +16,17 @@ and to ``\gamma_x`` and ``\gamma_e`` as to the node update and edge update funct
respectively. The aggregation ``\square`` is over the neighborhood ``N(i)`` of node ``i``,
and it is usually either a ``\sum``, a `max`, or a `mean` operation.

-In GraphNeuralNetworks.jl, the function [`propagate`](@ref) takes care of materializing the
-node features on each edge, applying the message function, performing the
+In GraphNeuralNetworks.jl, the message passing mechanism is exposed by the [`propagate`](@ref) function.
+[`propagate`](@ref) takes care of materializing the node features on each edge, applying the message function, performing the
aggregation, and returning ``\bar{\mathbf{m}}``.
It is then left to the user to perform further node and edge updates,
manipulating arrays of size ``D_{node} \times num\_nodes`` and
``D_{edge} \times num\_edges``.

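The behavior described above can be sketched with a toy example. This is a minimal, non-authoritative sketch: the graph, feature sizes, and the identity message function `(xi, xj, e) -> xj` are made up for illustration, assuming the exported `propagate` and `rand_graph` signatures.

```julia
using GraphNeuralNetworks

g = rand_graph(4, 6)                  # toy graph: 4 nodes, 6 edges
x = rand(Float32, 3, 4)               # D_node × num_nodes node features

# Identity message: each edge carries the source-node features xj.
m̄ = propagate((xi, xj, e) -> xj, g, +; xj = x)

size(m̄)                               # D_node × num_nodes, like x
```

The aggregated messages `m̄` come back as a dense array with one column per node, so the user can apply any further node update on it directly.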
-[`propagate`](@ref) is composed of two steps corresponding to two
-exported methods:
-1. [`apply_edges`](@ref) materializes node features on edges and
-performs edge-related computation without.
-2. [`aggregate_neighbors`](@ref) applies a reduction operator on the messages coming
-from the neighborhood of each node.
+[`propagate`](@ref) is composed of two steps, also available as two independent methods:
+
+1. [`apply_edges`](@ref) materializes node features on edges and applies the message function.
+2. [`aggregate_neighbors`](@ref) applies a reduction operator on the messages coming from the neighborhood of each node.
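The two-step decomposition can be sketched as follows. This is an illustrative example, assuming the keyword-based `apply_edges` signature; the graph and features are hypothetical.

```julia
using GraphNeuralNetworks

g = rand_graph(4, 6)                  # toy graph: 4 nodes, 6 edges
x = rand(Float32, 3, 4)               # D_node × num_nodes node features

# Step 1: materialize source-node features on each edge
#         and apply the message function (here, the identity on xj).
m = apply_edges((xi, xj, e) -> xj, g; xj = x)   # D × num_edges

# Step 2: reduce the messages arriving at each target node.
x̄ = aggregate_neighbors(g, +, m)                # D × num_nodes
```

Running the two steps separately is useful when some edge-level computation (e.g. attention scores) is needed between the message and the aggregation.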
The whole propagation mechanism internally relies on the [`NNlib.gather`](@ref)
and [`NNlib.scatter`](@ref) methods.
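For intuition, the same computation can be written directly with `NNlib.gather` and `NNlib.scatter`. This is a sketch with a hand-built edge list; the node features and edge indices are hypothetical.

```julia
using NNlib

x = Float32[1 2 3; 4 5 6]     # D × num_nodes node features
s = [1, 1, 2, 3]              # source node of each edge
t = [2, 3, 3, 1]              # target node of each edge

xj = NNlib.gather(x, s)       # materialize source features on edges: D × num_edges
m  = xj                       # identity message function

# sum-aggregate messages into their target nodes: D × num_nodes
x̄ = NNlib.scatter(+, m, t; dstsize = size(x))
```

`gather` performs the materialization step of `apply_edges`, while `scatter` performs the reduction step of `aggregate_neighbors`.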