Commit 6b0bf24

Add GINConv docs
1 parent 60985a3 commit 6b0bf24

1 file changed: +45 −0

GNNLux/src/layers/conv.jl

@@ -1272,6 +1272,51 @@ function Base.show(io::IO, l::GatedGraphConv)
    print(io, ")")
end

@doc raw"""
    GINConv(f, ϵ; aggr=+)

Graph Isomorphism convolutional layer from the paper [How Powerful are Graph Neural Networks?](https://arxiv.org/pdf/1810.00826.pdf).

Implements the graph convolution
```math
\mathbf{x}_i' = f_\Theta\left((1 + \epsilon) \mathbf{x}_i + \sum_{j \in N(i)} \mathbf{x}_j \right)
```
where ``f_\Theta`` typically denotes a learnable function, e.g. a linear layer or a multi-layer perceptron.

# Arguments

- `f`: A (possibly learnable) function acting on node features.
- `ϵ`: Weighting factor for the features of the central node.
- `aggr`: Aggregation operator for the incoming messages (e.g. `+`, `*`, `max`, `min`, and `mean`).

# Examples

```julia
using GNNLux, Lux, Random, Statistics

# initialize random number generator
rng = Random.default_rng()

# create data
s = [1,1,2,3]
t = [2,3,1,1]
in_channel = 3
out_channel = 5
g = GNNGraph(s, t)
x = randn(rng, Float32, in_channel, g.num_nodes)

# create dense layer
nn = Dense(in_channel, out_channel)

# create layer
l = GINConv(nn, 0.01f0, aggr = mean)

# setup layer
ps, st = LuxCore.setup(rng, l)

# forward pass
y, st = l(g, x, ps, st)    # size: out_channel × num_nodes
```
"""
@concrete struct GINConv <: GNNContainerLayer{(:nn,)}
    nn <: AbstractLuxLayer
    ϵ <: Real
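As a usage note (not part of the commit): the documented layer is often paired with a multi-layer perceptron as ``f_\Theta``, as in the original paper. Below is a minimal sketch assuming illustrative sizes (3 input channels, a hidden width of 16, 5 output channels, and `relu`) and the same toy graph as in the docstring example.

```julia
using GNNLux, Lux, Random

rng = Random.default_rng()

# same toy graph and features as in the docstring example
g = GNNGraph([1, 1, 2, 3], [2, 3, 1, 1])
x = randn(rng, Float32, 3, g.num_nodes)

# a small MLP as f_Θ: 3 => 16 => 5 (hidden width chosen purely for illustration)
mlp = Chain(Dense(3 => 16, relu), Dense(16 => 5))

# GIN layer with ϵ = 0 and the default sum aggregation
l = GINConv(mlp, 0)

ps, st = LuxCore.setup(rng, l)
y, st = l(g, x, ps, st)    # y has size 5 × g.num_nodes
```

With `ϵ = 0` and sum aggregation this corresponds to the GIN-0 variant discussed in the paper.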
