Commit 5600368

leave docs of GNNLux as before

1 parent 883f04d

File tree

1 file changed: 4 additions, 4 deletions

GNNLux/src/layers/temporalconv.jl

Lines changed: 4 additions & 4 deletions
```diff
@@ -59,7 +59,7 @@ function Base.show(io::IO, tgcn::TGCNCell)
 end
 
 """
-    TGCN(in => out; use_bias = true, init_weight = glorot_uniform, init_state = zeros32, init_bias = zeros32, add_self_loops = false, use_edge_weight = true, act = relu)
+    TGCN(in => out; use_bias = true, init_weight = glorot_uniform, init_state = zeros32, init_bias = zeros32, add_self_loops = false, use_edge_weight = true, act = sigmoid)
 
 Temporal Graph Convolutional Network (T-GCN) recurrent layer from the paper [T-GCN: A Temporal Graph Convolutional Network for Traffic Prediction](https://arxiv.org/pdf/1811.05320.pdf).
 
```
```diff
@@ -78,7 +78,7 @@ Performs a layer of GCNConv to model spatial dependencies, followed by a Gated R
   If `add_self_loops=true` the new weights will be set to 1.
   This option is ignored if the `edge_weight` is explicitly provided in the forward pass.
   Default `false`.
-- `act`: Activation function used in the GCNConv layer. Default `relu`.
+- `act`: Activation function used in the GCNConv layer. Default `sigmoid`.
 
 
 # Examples
```
```diff
@@ -93,11 +93,11 @@ rng = Random.default_rng()
 g = rand_graph(rng, 5, 10)
 x = rand(rng, Float32, 2, 5)
 
-# create TGCN layer with default activation (relu)
+# create TGCN layer with default activation (sigmoid)
 tgcn = TGCN(2 => 6)
 
 # create TGCN layer with custom activation
-tgcn_relu = TGCN(2 => 6, act = sigmoid)
+tgcn_relu = TGCN(2 => 6, act = relu)
 
 # setup layer
 ps, st = LuxCore.setup(rng, tgcn)
```
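Assembled from the example lines in the restored docstring, here is a sketch of how the layer would be used end to end. Only the lines shown in the diff are confirmed by this commit; the `using` line and the forward-call signature are assumptions based on the usual Lux-style interface of GNNLux layers.

```julia
using GNNLux, LuxCore, Random   # assumed imports; the diff shows only usage lines

rng = Random.default_rng()
g = rand_graph(rng, 5, 10)       # random graph: 5 nodes, 10 edges
x = rand(rng, Float32, 2, 5)     # 2 input features per node

tgcn = TGCN(2 => 6)              # default act is sigmoid, per the restored docstring
ps, st = LuxCore.setup(rng, tgcn)

# Forward pass (assumed Lux-style call convention, not shown in this diff):
# y, st = tgcn(g, x, ps, st)    # y would hold 6-feature node embeddings
```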

0 commit comments