# "Relational inductive biases, deep learning, and graph networks"

"""
-    propagate(mp, g::GNNGraph, aggr)
-    propagate(mp, g::GNNGraph, E, X, u, aggr)
+    propagate(mp, g, X, E, U, aggr)

-Perform the sequence of operation implementing the message-passing scheme
-and updating node, edge, and global features `X`, `E`, and `u` respectively.
+Perform the sequence of operations implementing the message-passing scheme
+on graph `g` with convolution layer `mp`.
+Updates the node, edge, and global features `X`, `E`, and `U` respectively.

The computation involved is the following:

```julia
-M = compute_batch_message(mp, g, E, X, u)
-E = update_edge(mp, M, E, u)
+M = compute_batch_message(mp, g, X, E, U)
M̄ = aggregate_neighbors(mp, aggr, g, M)
-X = update(mp, M̄, X, u)
-u = update_global(mp, E, X, u)
+X′ = update(mp, X, M̄, U)
+E′ = update_edge(mp, M, E, U)
+U′ = update_global(mp, U, X′, E′)
```

Custom layers typically define their own [`update`](@ref)
-and [`message`](@ref) function, then call
+and [`message`](@ref) functions, then call
this method in the forward pass:

```julia
function (l::MyLayer)(g, X)
    ... some preprocessing if needed ...
-    E = nothing
-    u = nothing
-    propagate(l, g, E, X, u, +)
+    E, U = nothing, nothing  # this example has no edge or global features
+    propagate(l, g, X, E, U, +)
end
```

@@ -36,8 +34,8 @@ See also [`message`](@ref) and [`update`](@ref).
function propagate end

function propagate(mp, g::GNNGraph, aggr)
-    E, X, U = propagate(mp, g,
-        edge_features(g), node_features(g), global_features(g),
+    X, E, U = propagate(mp, g,
+        node_features(g), edge_features(g), global_features(g),
        aggr)
    GNNGraph(g, ndata=X, edata=E, gdata=U)
end
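To see the new `X, E, U` sequence end to end, here is a minimal self-contained sketch of the message-passing scheme described in the docstring, written in Python for brevity rather than Julia. The step comments mirror the docstring's `compute_batch_message` / `aggregate_neighbors` / `update` / `update_edge` / `update_global` sequence, but the particular message, update, and readout rules are toy choices invented for this illustration, not the package's API.

```python
def message(x_src, x_dst, e):
    # Toy message function: scale the source node feature by the edge weight.
    return e * x_src


def propagate(edges, X, E, U, aggr=sum):
    """edges: list of (src, dst) pairs; X: node features;
    E: per-edge features; U: a scalar global feature."""
    # M = compute_batch_message: one message per edge.
    M = [message(X[s], X[d], e) for (s, d), e in zip(edges, E)]
    # M̄ = aggregate_neighbors: combine incoming messages per destination node.
    Mbar = [aggr([m for (_, d), m in zip(edges, M) if d == i] or [0.0])
            for i in range(len(X))]
    # X′ = update: toy residual update of node features.
    Xp = [x + mb for x, mb in zip(X, Mbar)]
    # E′ = update_edge: here the new edge features are simply the messages.
    Ep = M
    # U′ = update_global: toy readout, old global plus sum of new node features.
    Up = U + sum(Xp)
    return Xp, Ep, Up


# Usage: a 3-node path graph 0 -> 1 -> 2 with edge weights 1.0 and 2.0.
Xp, Ep, Up = propagate([(0, 1), (1, 2)], [1.0, 2.0, 3.0], [1.0, 2.0], 0.0)
```

Note how the node update consumes the aggregated messages `M̄` while the edge and global updates consume `M` and the already-updated `X′`, matching the order the diff introduces.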