# GraphNeuralNetworks
GraphNeuralNetworks.jl is a graph neural network package based on the deep learning framework [Flux.jl](https://github.com/FluxML/Flux.jl).
It provides a set of stateful graph convolutional layers and utilities to build graph neural networks.
Among its features:
* Implements common graph convolutional layers.
* Supports computations on batched graphs.
* Easy to define custom layers.
* CUDA support.
* Integration with [Graphs.jl](https://github.com/JuliaGraphs/Graphs.jl).
* [Examples](https://github.com/JuliaGraphs/GraphNeuralNetworks.jl/tree/master/GraphNeuralNetworks/examples) of node, edge, and graph level machine learning tasks.
* Heterogeneous and temporal graphs.
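The batched-graph support mentioned above can be sketched as follows. This is a minimal illustrative example, not taken from the package documentation; it assumes `rand_graph` and `MLUtils.batch` behave as in the package's public API:

```julia
using GraphNeuralNetworks, MLUtils

# Two small random graphs with 3-dimensional node features
g1 = rand_graph(5, 10, ndata = rand(Float32, 3, 5))
g2 = rand_graph(7, 14, ndata = rand(Float32, 3, 7))

# MLUtils.batch glues them into a single GNNGraph with two components,
# so one forward pass processes both graphs at once
g = MLUtils.batch([g1, g2])
g.num_graphs  # 2
g.num_nodes   # 12
```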
The package is part of a larger ecosystem of packages that includes [GNNlib.jl](https://juliagraphs.org/GraphNeuralNetworks.jl/gnnlib), [GNNGraphs.jl](https://juliagraphs.org/GraphNeuralNetworks.jl/gnngraphs), and [GNNLux.jl](https://juliagraphs.org/GraphNeuralNetworks.jl/gnnlux).
GraphNeuralNetworks.jl is the frontend package for Flux.jl users. [Lux.jl](https://lux.csail.mit.edu/stable/) users can instead rely on GNNLux.jl (still in development).
## Installation
GraphNeuralNetworks.jl is a registered Julia package. You can easily install it through the package manager:
```julia
pkg> add GraphNeuralNetworks
```
## Package overview
Let's give a brief overview of the package by solving a graph regression problem with synthetic data.
Other usage examples can be found in the [examples](https://github.com/JuliaGraphs/GraphNeuralNetworks.jl/tree/master/GraphNeuralNetworks/examples) folder, in the [notebooks](https://github.com/JuliaGraphs/GraphNeuralNetworks.jl/tree/master/GraphNeuralNetworks/notebooks) folder, and in the [tutorials](https://juliagraphs.org/GraphNeuralNetworks.jl/tutorials/) section of the documentation.
### Data preparation
We create a dataset consisting of multiple random graphs and associated data features.
```julia
using GraphNeuralNetworks, Flux, CUDA, Statistics, MLUtils
using Flux: DataLoader

all_graphs = GNNGraph[]

for _ in 1:1000
    g = rand_graph(10, 40,
                   ndata = (; x = randn(Float32, 16, 10)),  # Input node features
                   gdata = (; y = randn(Float32)))          # Regression target
    push!(all_graphs, g)
end
```
### Model building
We concisely define our model as a [`GraphNeuralNetworks.GNNChain`](@ref) containing two graph convolutional layers. If CUDA is available, our model will live on the GPU.
```julia
device = CUDA.functional() ? Flux.gpu : Flux.cpu;

model = GNNChain(GCNConv(16 => 64),
                 BatchNorm(64),     # Apply batch normalization on node features (nodes dimension is batch dimension)
                 x -> relu.(x),
                 GCNConv(64 => 64, relu),
                 GlobalPool(mean),  # Aggregate node-wise features into graph-wise features
                 Dense(64, 1)) |> device

opt = Flux.setup(Adam(1f-4), model)
```
### Training
Finally, we use a standard Flux training pipeline to fit our dataset. We use Flux's `DataLoader` to iterate over mini-batches of graphs that are glued together into a single `GNNGraph` using the `MLUtils.batch` method.
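The training loop itself was lost in this excerpt; the following is a plausible sketch of such a pipeline, assuming an 80/20 train/test split, a mean-squared-error loss on the graph-level target `y`, and the `model`, `opt`, and `device` defined above (the hyperparameters and helper names are illustrative, not from the original):

```julia
train_graphs, test_graphs = MLUtils.splitobs(all_graphs, at = 0.8)

# collate=true batches each mini-batch of graphs into a single GNNGraph
train_loader = DataLoader(train_graphs, batchsize = 32, shuffle = true, collate = true)
test_loader = DataLoader(test_graphs, batchsize = 32, shuffle = false, collate = true)

# Mean squared error between the model's graph-level prediction and the target y
loss(model, g::GNNGraph) = mean((vec(model(g, g.x)) .- g.y) .^ 2)
loss(model, loader) = mean(loss(model, g |> device) for g in loader)

for epoch in 1:100
    for g in train_loader
        g = g |> device
        grad = Flux.gradient(model -> loss(model, g), model)
        Flux.update!(opt, model, grad[1])
    end
    @info (; epoch, train_loss = loss(model, train_loader),
             test_loss = loss(model, test_loader))
end
```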
## Google Summer of Code

Potential candidates for Google Summer of Code scholarships can find out about the available projects involving GraphNeuralNetworks.jl on the [dedicated page](https://julialang.org/jsoc/gsoc/gnn/) of the Julia Language website.
## Citing
If you use GraphNeuralNetworks.jl in a scientific publication, we would appreciate the following reference:
GraphNeuralNetworks.jl is largely inspired by [PyTorch Geometric](https://pytorch-geometric.readthedocs.io/en/latest/) and [GeometricFlux.jl](https://fluxml.ai/GeometricFlux.jl/stable/).