    print(io, ")")
end

@doc raw"""
    GMMConv((in, ein) => out, σ = identity; K = 1, residual = false, init_weight = glorot_uniform, init_bias = zeros32, use_bias = true)

Graph mixture model convolution layer from the paper [Geometric deep learning on graphs and manifolds using mixture model CNNs](https://arxiv.org/abs/1611.08402).
Performs the operation
```math
\mathbf{x}_i' = \mathbf{x}_i + \frac{1}{|N(i)|} \sum_{j\in N(i)} \frac{1}{K} \sum_{k=1}^K \mathbf{w}_k(\mathbf{e}_{j\to i}) \odot \Theta_k \mathbf{x}_j
```
where ``w^a_{k}(e^a)`` for feature `a` and kernel `k` is given by
```math
w^a_{k}(e^a) = \exp\left(-\frac{1}{2}(e^a - \mu^a_k)^T (\Sigma^{-1})^a_k (e^a - \mu^a_k)\right)
```
``\Theta_k, \mu^a_k, (\Sigma^{-1})^a_k`` are learnable parameters.
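
As a quick sanity check, the Gaussian kernel weight above can be evaluated directly. This is a hypothetical standalone sketch (not part of the layer), assuming a scalar pseudo-coordinate so that ``(\Sigma^{-1})^a_k`` reduces to a scalar precision:

```julia
# Standalone sketch of the per-feature kernel weight w^a_k(e^a),
# assuming scalar e, mean mu, and precision inv_sigma (hypothetical names).
kernel_weight(e, mu, inv_sigma) = exp(-0.5 * (e - mu) * inv_sigma * (e - mu))

kernel_weight(0.0, 0.0, 1.0)   # 1.0 at the kernel center
kernel_weight(1.0, 0.0, 1.0)   # exp(-1/2) ≈ 0.6065, decaying away from the mean
```

The weight is maximal when the pseudo-coordinate coincides with the kernel mean and decays with the Mahalanobis distance from it.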

The input to the layer is a node feature array `x` of size `(num_features, num_nodes)` and an
edge pseudo-coordinate array `e` of size `(num_features, num_edges)`.
The residual ``\mathbf{x}_i`` is added only if `residual = true` and the output size is the same
as the input size.

# Arguments

- `in`: Number of input node features.
- `ein`: Number of input edge features.
- `out`: Number of output features.
- `σ`: Activation function. Default `identity`.
- `K`: Number of kernels. Default `1`.
- `residual`: Residual connection. Default `false`.
- `init_weight`: Weights' initializer. Default `glorot_uniform`.
- `init_bias`: Bias initializer. Default `zeros32`.
- `use_bias`: Add learnable bias. Default `true`.

# Examples

```julia
using GNNLux, Lux, Random

# initialize random number generator
rng = Random.default_rng()

# create data
s = [1, 1, 2, 3]
t = [2, 3, 1, 1]
g = GNNGraph(s, t)
nin, ein, out, K = 4, 10, 7, 8
x = randn(rng, Float32, nin, g.num_nodes)
e = randn(rng, Float32, ein, g.num_edges)

# create layer
l = GMMConv((nin, ein) => out, K = K)

# setup layer
ps, st = LuxCore.setup(rng, l)

# forward pass
y, st = l(g, x, e, ps, st)   # size: out × num_nodes
```
"""
@concrete struct GMMConv <: GNNLayer
    σ
    ch::Pair{NTuple{2, Int}, Int}