
Commit 2527750

Fix a few typos (#281)
* Fix a typo in docs/tutorials/index.md
* Fix a few typos in node_classification_pluto.jl
* Fix a typo in node_classification_cora.jl
* Fix a typo in gnngraph.jl
* Fix a typo in gnnheterograph.jl
* Fix a typo in test_utils.jl
1 parent ee6dbed commit 2527750

File tree

6 files changed (+10 additions, -10 deletions)


docs/tutorials/index.md

Lines changed: 1 addition & 1 deletion
@@ -16,4 +16,4 @@ checkout these tutorials from
 [PyTorch Geometric](https://pytorch-geometric.readthedocs.io/en/latest/notes/colabs.html).
 You are expected to use [Pluto.jl](https://github.com/fonsp/Pluto.jl) notebooks
 with [DemoCards.jl](https://github.com/JuliaDocs/DemoCards.jl).
-Please check out exsisting tutorials for more details.
+Please check out existing tutorials for more details.

docs/tutorials/introductory_tutorials/node_classification_pluto.jl

Lines changed: 5 additions & 5 deletions
@@ -44,7 +44,7 @@ Let us start off by importing some libraries. We will be using Flux.jl and `Grap
 # ╔═╡ 0d556a7c-d4b6-4cef-806c-3e1712de0791
 md"""
 ## Visualize
-We want to visualize the the outputs of the resutls using t-distributed stochastic neighbor embedding (tsne) to embed our output embeddings onto a 2D plane.
+We want to visualize the the outputs of the results using t-distributed stochastic neighbor embedding (tsne) to embed our output embeddings onto a 2D plane.
 """
 
 # ╔═╡ 997b5387-3811-4998-a9d1-7981b58b9e09
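For context, a minimal sketch of the visualization step discussed in this hunk, assuming TSne.jl and Plots.jl are available and that `out` holds the trained model's node embeddings (one column per node) and `targets` the class labels; these variable names are illustrative, not taken from the notebook.

```julia
# Illustrative only: project node embeddings onto a 2D plane with t-SNE and plot them.
using TSne, Plots

# `out` is assumed to be a d × num_nodes embedding matrix; tsne expects one
# observation per row, so transpose before reducing to 2 dimensions.
out2d = tsne(permutedims(out), 2)

# Color each point by its (assumed) integer class label.
scatter(out2d[:, 1], out2d[:, 2]; marker_z = targets, markersize = 3, legend = false)
```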
@@ -57,11 +57,11 @@ end
 md"""
 ## Dataset: Cora
 
-For our tutorial, we will be using the `Cora` dataset. `Cora` is a citaton network of 2708 documents classified into one of seven classes and 5429 links. Each node represent articles/documents and the edges between these nodes if one of them cite each other.
+For our tutorial, we will be using the `Cora` dataset. `Cora` is a citation network of 2708 documents classified into one of seven classes and 5429 links. Each node represent articles/documents and the edges between these nodes if one of them cite each other.
 
 Each publication in the dataset is described by a 0/1-valued word vector indicating the absence/presence of the corresponding word from the dictionary. The dictionary consists of 1433 unique words.
 
-This dataset was first introduced by [Yang et al. (2016)](https://arxiv.org/abs/1603.08861) as one of the datasets of the `Planetoid` benchmark suite. We will be using [MLDatasets.jl](https://juliaml.github.io/MLDatasets.jl/stable/) for an easy accss to this dataset.
+This dataset was first introduced by [Yang et al. (2016)](https://arxiv.org/abs/1603.08861) as one of the datasets of the `Planetoid` benchmark suite. We will be using [MLDatasets.jl](https://juliaml.github.io/MLDatasets.jl/stable/) for an easy access to this dataset.
 """
 
 # ╔═╡ edab1e3a-31f6-471f-9835-5b1f97e5cf3f
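As a hedged sketch of the dataset access described above (the notebook's actual cells may differ), the `Cora` loader from MLDatasets.jl can be used like this:

```julia
# Sketch: load the Cora citation network through MLDatasets.jl.
using MLDatasets

dataset = Cora()    # Planetoid citation dataset: 2708 documents, 7 classes
dataset.metadata    # dataset-level information
dataset.graphs      # one-element vector; Cora contains a single graph
```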
@@ -77,15 +77,15 @@ dataset.metadata
 
 # ╔═╡ 3438ee7f-bfca-465d-85df-13379622d415
 md"""
-The `graphs` variable GraphDataset contains the graph. The `Cora` dataaset contains only 1 graph.
+The `graphs` variable GraphDataset contains the graph. The `Cora` dataset contains only 1 graph.
 """
 
 # ╔═╡ eec6fb60-0774-4f2a-bcb7-dbc28ab747a6
 dataset.graphs
 
 # ╔═╡ bd2fd04d-7fb0-4b31-959b-bddabe681754
 md"""
-There is only one graph of the dataset. The `node_data` contians `features` indicating if certain words are present or not and `targets` indicating the class for each document. We convert the single-graph dataset to a `GNNGraph`.
+There is only one graph of the dataset. The `node_data` contains `features` indicating if certain words are present or not and `targets` indicating the class for each document. We convert the single-graph dataset to a `GNNGraph`.
 """
 
 # ╔═╡ b29c3a02-c21b-4b10-aa04-b90bcc2931d8
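A rough sketch of the conversion mentioned in the last hunk, assuming the `mldataset2gnngraph` helper from GraphNeuralNetworks.jl and assuming the `node_data` fields carry over as `ndata` keys (these names are assumptions, not quoted from the notebook):

```julia
# Sketch: turn the single-graph MLDatasets object into a GNNGraph and read
# the word-indicator features and class targets stored as node data.
using GraphNeuralNetworks, MLDatasets

dataset = Cora()
g = mldataset2gnngraph(dataset)   # a GNNGraph with 2708 nodes

x = g.ndata.features              # 1433 × 2708 matrix of 0/1 word indicators
y = g.ndata.targets               # class label for each document
```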

test/GNNGraphs/gnngraph.jl

Lines changed: 1 addition & 1 deletion
@@ -338,7 +338,7 @@ end
 @test g1 !== g2
 end
 
-## Cannot test this because DataStore is not an oredered collection
+## Cannot test this because DataStore is not an ordered collection
 ## Uncomment when/if it will be based on OrderedDict
 # @testset "show" begin
 # @test sprint(show, rand_graph(10, 20)) == "GNNGraph(10, 20) with no data"
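For context on the disabled test above: comparing `sprint(show, g)` against an exact string is brittle while `DataStore` iteration order is unspecified. A hedged alternative sketch that checks substrings instead of the full string:

```julia
# Sketch: match fragments of the printed representation so the test does not
# depend on the (unordered) DataStore iteration order.
using GraphNeuralNetworks, Test

g = rand_graph(10, 20)
s = sprint(show, g)
@test occursin("GNNGraph(10, 20)", s)
```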

test/GNNGraphs/gnnheterograph.jl

Lines changed: 1 addition & 1 deletion
@@ -69,7 +69,7 @@ end
 @test hg.num_edges == Dict((:A, :rel1, :B) => 20, (:B, :rel2, :A) => 30)
 end
 
-## Cannot test this because DataStore is not an oredered collection
+## Cannot test this because DataStore is not an ordered collection
 ## Uncomment when/if it will be based on OrderedDict
 # @testset "show" begin
 # num_nodes = Dict(:A => 10, :B => 20);
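For readers unfamiliar with the `num_edges` dictionary checked in the context line above, a small sketch of building a heterogeneous graph with two relation types; the edge lists below are invented for illustration, and the pairwise constructor shown is assumed from the GraphNeuralNetworks.jl API.

```julia
# Sketch: a GNNHeteroGraph with node types :A and :B and two relation types.
using GraphNeuralNetworks

hg = GNNHeteroGraph((:A, :rel1, :B) => ([1, 2, 3], [1, 1, 2]),
                    (:B, :rel2, :A) => ([1, 2], [3, 3]);
                    num_nodes = Dict(:A => 10, :B => 20))

hg.num_edges   # Dict((:A, :rel1, :B) => 3, (:B, :rel2, :A) => 2)
```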

test/examples/node_classification_cora.jl

Lines changed: 1 addition & 1 deletion
@@ -92,7 +92,7 @@ function train_many(; usecuda = false)
 ## ("ChebConv", (nin, nout) -> ChebConv(nin => nout, 2)), # not working on gpu
 ## ("NNConv", (nin, nout) -> NNConv(nin => nout)), # needs edge features
 ## ("GatedGraphConv", (nin, nout) -> GatedGraphConv(nout, 2)), # needs nin = nout
-## ("EdgeConv",(nin, nout) -> EdgeConv(Dense(2nin, nout, relu))), # Fits the traning set but does not generalize well
+## ("EdgeConv",(nin, nout) -> EdgeConv(Dense(2nin, nout, relu))), # Fits the training set but does not generalize well
 ]
 @show layer
 @time train_res, test_res = train(Layer; usecuda, verbose = false)
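As a hedged illustration of the `EdgeConv` entry toggled above: the wrapped MLP sees the concatenation `[x_i; x_j - x_i]`, hence the `2nin` input width. The dimensions below are invented for the example.

```julia
# Sketch: construct an EdgeConv layer and run it on a random graph.
using GraphNeuralNetworks, Flux

nin, nout = 3, 5
l = EdgeConv(Dense(2 * nin => nout, relu))   # aggregation defaults to max

g = rand_graph(10, 40)                       # 10 nodes, 40 edges
x = rand(Float32, nin, 10)                   # node features
y = l(g, x)                                  # nout × 10 output
```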

test/test_utils.jl

Lines changed: 1 addition & 1 deletion
@@ -20,7 +20,7 @@ end
 # Test gradients with respects to layer weights and to input.
 # If `g` has edge features, it is assumed that the layer can
 # use them in the forward pass as `l(g, x, e)`.
-# Test also gradient with repspect to `e`.
+# Test also gradient with respect to `e`.
 function test_layer(l, g::GNNGraph; atol = 1e-5, rtol = 1e-5,
 exclude_grad_fields = [],
 verbose = false,
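A rough sketch of the kind of check `test_layer` performs for the gradient with respect to `e`, comparing reverse-mode AD against finite differences; the layer choice, sizes, and tolerances here are illustrative, not the actual implementation.

```julia
# Sketch: compare Zygote's gradient w.r.t. edge features against a
# finite-difference estimate (tolerances are loose and illustrative).
using GraphNeuralNetworks, Flux, Zygote, FiniteDifferences, Test

g = rand_graph(6, 10)
x = rand(Float32, 3, 6)                  # node features
e = rand(Float32, 2, 10)                 # edge features
l = NNConv(3 => 4, Dense(2 => 3 * 4))    # a layer that uses edge features

loss(e) = sum(l(g, x, e))
grad_ad = Zygote.gradient(loss, e)[1]
grad_fd = FiniteDifferences.grad(central_fdm(5, 1), loss, e)[1]
@test isapprox(grad_ad, grad_fd; rtol = 1e-4, atol = 1e-4)
```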
