
Commit 0ec313e

update userguide.md
1 parent 73e047b


docs/src/userguide.md

Lines changed: 34 additions & 34 deletions

## Kernel creation

To create a kernel object, choose one of the pre-implemented kernels, see [Base Kernels](@ref), or create your own, see [Creating your own kernel](@ref).
For example, a squared exponential kernel is created by
```julia
k = SqExponentialKernel()
```
Instead of having lengthscale(s) for each kernel, we use [`Transform`](@ref) objects which act on the inputs before passing them to the kernel.
For example, to premultiply the input by 2.0 (equivalent to a lengthscale of 0.5) we can use the following options:
```julia
k = transform(SqExponentialKernel(), ScaleTransform(2.0)) # returns a TransformedKernel
k = TransformedKernel(SqExponentialKernel(), ScaleTransform(2.0))
k = @kernel SqExponentialKernel() l=2.0 # Will be available soon
```
Check the [`Transform`](@ref) page to see all available transforms.

To premultiply the kernel by a variance, you can use `*` or create a `ScaledKernel`:
```julia
k = 3.0 * SqExponentialKernel()
k = ScaledKernel(SqExponentialKernel(), 3.0)
@kernel 3.0 * SqExponentialKernel()
```

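A transform and a variance can be combined, since a transformed kernel is itself a kernel. As a minimal sketch using the constructors above:
```julia
k = 3.0 * transform(SqExponentialKernel(), ScaleTransform(2.0)) # ScaledKernel wrapping a TransformedKernel
```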

To compute the kernel function on two vectors you can call:
```julia
k = SqExponentialKernel()
x1 = rand(3)
x2 = rand(3)
k(x1, x2)
```

## Creating a kernel matrix

Kernel matrices can be created via the `kernelmatrix` function or `kerneldiagmatrix` for only the diagonal.
An important argument is `obsdim`, which specifies the data layout of the input: whether observations lie along the first dimension (`obsdim=1`, i.e. the matrix is number of samples × number of features) or along the second dimension (`obsdim=2`, number of features × number of samples), similarly to [Distances.jl](https://github.com/JuliaStats/Distances.jl). If not given explicitly, `obsdim` defaults to [`defaultobs`](@ref).
For example:
```julia
k = SqExponentialKernel()
A = rand(10, 5)
kernelmatrix(k, A, obsdim=1) # returns a 10x10 matrix
kernelmatrix(k, A, obsdim=2) # returns a 5x5 matrix
k(A, obsdim=1) # syntactic sugar
```

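`kerneldiagmatrix` follows the same calling convention; as a minimal sketch, reusing `k` and `A` from above and assuming the analogous keyword argument:
```julia
kerneldiagmatrix(k, A, obsdim=1) # the diagonal of kernelmatrix(k, A, obsdim=1), here a vector of length 10
```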
We also support specific kernel matrix outputs:
- For a positive-definite matrix object `PDMat` from [`PDMats.jl`](https://github.com/JuliaStats/PDMats.jl), you can call the following:
```julia
using PDMats
k = SqExponentialKernel()
K = kernelpdmat(k, A, obsdim=1) # PDMat
```
It creates the kernel matrix and, in case of bad conditioning, adds diagonal noise until the matrix is positive-definite; it then returns a `PDMat` object. For this method to work you need to load the package first with `using PDMats`.
- For a Kronecker matrix, we rely on [`Kronecker.jl`](https://github.com/MichielStock/Kronecker.jl). Here are two examples:
```julia
using Kronecker
x = range(0, 1, length=10)
y = range(0, 1, length=50)
K = kernelkronmat(k, [x, y]) # Kronecker matrix
K = kernelkronmat(k, x, 5) # Kronecker matrix
```
Make sure that `k` is a kernel compatible with such constructions (check with `iskroncompatible(k)`). Both methods return a Kronecker matrix. For these methods to work you need to load the package first with `using Kronecker`.
- For a Nystrom approximation: `kernelmatrix(nystrom(k, X, ρ, obsdim=1))` where `ρ` is the fraction of data samples used in the approximation (see the sketch below).

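As a minimal sketch of the Nystrom call above, reusing the kernel `k` and the matrix `A` from the previous examples and keeping 30% of the samples:
```julia
k = SqExponentialKernel()
A = rand(10, 5)
K_approx = kernelmatrix(nystrom(k, A, 0.3, obsdim=1)) # approximate 10x10 kernel matrix
```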

## Composite kernels

Sums and products of kernels are also valid kernels. They can be created via `KernelSum` and `KernelProduct` or using the simple operators `+` and `*`.
For example:
```julia
k1 = SqExponentialKernel()
k2 = Matern32Kernel()
k = 0.5 * k1 + 0.2 * k2 # KernelSum
k = k1 * k2             # KernelProduct
```

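A composite kernel behaves like any other kernel; as a minimal sketch, reusing `k` from the example above:
```julia
k(rand(3), rand(3))                    # evaluate on two input vectors
kernelmatrix(k, rand(10, 5), obsdim=1) # 10x10 kernel matrix, as for any base kernel
```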

## Kernel parameters

What if you want to differentiate through the kernel parameters? This works even for a highly nested structure such as:
```julia
k = transform(0.5*SqExponentialKernel()*MaternKernel() + 0.2*(transform(LinearKernel(), 2.0) + PolynomialKernel()), [0.1, 0.5])
```
One can get the array of parameters to optimize via `params` from `Flux.jl`:
```julia
using Flux
params(k)
```

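As a hypothetical usage sketch, assuming the kernel computations are differentiable by Flux's automatic differentiation and reusing a data matrix `A` as above:
```julia
using Flux
A = rand(10, 5)
ps = params(k) # trainable parameters of the nested kernel above
grads = Flux.gradient(() -> sum(kernelmatrix(k, A, obsdim=1)), ps)
```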