GNNGraphs/docs/src/index.md
# GNNGraphs.jl
GNNGraphs.jl is a package that provides graph data structures and helper functions specifically designed for working with graph neural networks. It stores not only the graph structure, but also features associated with nodes, edges, and the graph itself. It is the core foundation for the GNNlib.jl, GraphNeuralNetworks.jl, and GNNLux.jl packages.
It supports three types of graphs:
This package depends on [Graphs.jl](https://github.com/JuliaGraphs/Graphs.jl).
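As a quick illustration of the structures described above, a graph with node features might be built like this (a minimal sketch; the `ndata` keyword and the `x` field name follow the package's documented constructor, and the feature values are illustrative):

```julia
using GNNGraphs

# a directed graph with 3 nodes and 4 edges, given as source/target vectors
s = [1, 1, 2, 3]
t = [2, 3, 1, 1]

# attach a 2-dimensional feature vector to each node via `ndata`
g = GNNGraph(s, t; ndata = (; x = rand(Float32, 2, 3)))

g.num_nodes   # 3
g.num_edges   # 4
```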
## Installation
The package can be installed with the Julia package manager.
From the Julia REPL, type `]` to enter the Pkg REPL mode and run:
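The command itself does not survive in this excerpt; assuming the package is registered under its own name in the General registry, it would presumably be:

```julia
pkg> add GNNGraphs
```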
Many different types of graph convolutional layers have been proposed in the literature. Choosing the right layer for your application can involve a lot of exploration.
Multiple graph convolutional layers are typically stacked together to create a graph neural network model (see [`GNNChain`](@ref)).
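For instance, a small stack of layers might look like the following (a hedged sketch; the layer sizes are illustrative, and the exact `GNNChain` constructor arguments should be checked against the API reference):

```julia
using GNNLux, Lux

# two graph convolutions with relu activations,
# followed by a dense readout layer
model = GNNChain(GCNConv(3 => 64, relu),
                 GCNConv(64 => 64, relu),
                 Dense(64 => 1))
```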
The table below lists all graph convolutional layers implemented in *GNNLux.jl*. It also highlights the presence of some additional capabilities with respect to basic message passing:
- *Sparse Ops*: implements message passing as multiplication by a sparse adjacency matrix instead of the gather/scatter mechanism. This can lead to better CPU performance, but it is not yet supported on GPU.
- *Edge Features*: supports feature vectors on edges.
- *Heterograph*: supports heterogeneous graphs (see [`GNNHeteroGraph`](@ref)).
- *TemporalSnapshotsGNNGraphs*: supports temporal graphs (see [`TemporalSnapshotsGNNGraph`](@ref)) by applying the convolution layers to each snapshot independently.
Takes as input a graph `g`, a node feature matrix `x` of size `[in, num_nodes]`, optionally an edge weight vector, and the parameters and state of the layer. Returns a node feature matrix of size `[out, num_nodes]`.
The `norm_fn` parameter allows for custom normalization of the graph convolution operation by passing a function as argument.
By default, it computes ``\frac{1}{\sqrt{d}}``, i.e., the inverse square root of the degree (`d`) of each node in the graph.
If `conv_weight` is an `AbstractMatrix` of size `[out, in]`, then the convolution is performed using that weight matrix.
# Examples
```julia
using GNNLux, Lux, Random

# initialize random number generator
rng = Random.default_rng()

# create data
s = [1, 1, 2, 3]
t = [2, 3, 1, 1]
g = GNNGraph(s, t)
x = randn(rng, Float32, 3, g.num_nodes)

# create layer
l = GCNConv(3 => 5)

# setup layer
ps, st = LuxCore.setup(rng, l)

# forward pass
y = l(g, x, ps, st)   # size of the output first entry: 5 × num_nodes

# convolution with edge weights and custom normalization function
# (the source cuts off here; the lines below are a sketch assuming the
# call accepts an edge-weight vector and a `norm_fn` keyword)
w = [1.1, 0.1, 2.3, 0.5]
custom_norm_fn(d) = 1 ./ sqrt.(d .+ 1)
y = l(g, x, w, ps, st; norm_fn = custom_norm_fn)
```
GNNlib.jl is a package that provides the implementation of the basic message passing functions and functional implementations of graph convolutional layers, which are used to build graph neural networks in both the [Flux.jl](https://fluxml.ai/Flux.jl/stable/) and [Lux.jl](https://lux.csail.mit.edu/stable/) machine learning frameworks, through the GraphNeuralNetworks.jl and GNNLux.jl packages, respectively.
This package depends on GNNGraphs.jl and NNlib.jl, and is primarily intended for developers looking to create new GNN architectures. For most users, the higher-level GraphNeuralNetworks.jl and GNNLux.jl packages are recommended.
## Installation
The package can be installed with the Julia package manager.
From the Julia REPL, type `]` to enter the Pkg REPL mode and run:
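The command itself does not survive in this excerpt; assuming the package is registered under its own name in the General registry, it would presumably be:

```julia
pkg> add GNNlib
```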
GNNlib/docs/src/messagepassing.md

```julia
function (l::GCN)(g::GNNGraph, x::AbstractMatrix{T}) where T
    # ... (body elided in this excerpt)
end
```
See the `GATConv` implementation [here](https://juliagraphs.org/GraphNeuralNetworks.jl/graphneuralnetworks/api/conv/) for a more complex example.