
Commit 397fb06

Minor improvements to the docs
1 parent d6eab2b commit 397fb06

File tree: 5 files changed, +43 −36 lines changed


docs/src/create_kernel.md

Lines changed: 5 additions & 5 deletions
@@ -2,9 +2,9 @@
 
 KernelFunctions.jl contains the most popular kernels already but you might want to make your own!
 
-Here are a few ways depending on how complicated your kernel is :
+Here are a few ways depending on how complicated your kernel is:
 
-### SimpleKernel for kernels function depending on a metric
+### SimpleKernel for kernel functions depending on a metric
 
 If your kernel function is of the form `k(x, y) = f(d(x, y))` where `d(x, y)` is a `PreMetric`,
 you can construct your custom kernel by defining `kappa` and `metric` for your kernel.
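For orientation, a minimal sketch of what such a definition can look like (the kernel name and the choice of `f` are illustrative, not part of the package):

```julia
using KernelFunctions
using Distances

# Hypothetical kernel k(x, y) = exp(-d(x, y)) with d the squared Euclidean
# distance: `kappa` maps the metric value, `metric` declares which metric is used.
struct MyExpKernel <: KernelFunctions.SimpleKernel end

KernelFunctions.kappa(::MyExpKernel, d::Real) = exp(-d)
KernelFunctions.metric(::MyExpKernel) = SqEuclidean()
```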
@@ -20,15 +20,15 @@ KernelFunctions.metric(::MyKernel) = SqEuclidean()
 ### Kernel for more complex kernels
 
 If your kernel does not satisfy such a representation, all you need to do is define `(k::MyKernel)(x, y)` and inherit from `Kernel`.
-For example we recreate here the `NeuralNetworkKernel`
+For example, we recreate here the `NeuralNetworkKernel`:
 
 ```julia
 struct MyKernel <: KernelFunctions.Kernel end
 
 (::MyKernel)(x, y) = asin(dot(x, y) / sqrt((1 + sum(abs2, x)) * (1 + sum(abs2, y))))
 ```
 
-Note that `BaseKernel` do not use `Distances.jl` and can therefore be a bit slower.
+Note that the fallback implementation of the base `Kernel` evaluation does not use `Distances.jl` and can therefore be a bit slower.
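As a quick usage sketch (assuming the definition above has been evaluated; `ColVecs`, which wraps a matrix as a collection of column-vector inputs, is assumed available):

```julia
using KernelFunctions, LinearAlgebra

k = MyKernel()
x, y = rand(3), rand(3)
k(x, y)                   # pointwise evaluation

X = ColVecs(rand(3, 5))   # 5 inputs of dimension 3
kernelmatrix(k, X)        # 5×5 kernel matrix
```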
 
 ### Additional Options
 
@@ -37,7 +37,7 @@ Finally there are additional functions you can define to bring in more features:
 - `KernelFunctions.dim(x::MyDataType)`: by default the dimension of the inputs will only be checked for vectors of type `AbstractVector{<:Real}`. If you want to check the dimensionality of your inputs, dispatch the `dim` function on your datatype. Note that `0` is the default.
 - `dim` is called within `KernelFunctions.validate_inputs(x::MyDataType, y::MyDataType)`, which can instead be directly overloaded if you want to run special checks for your input types.
 - `kernelmatrix(k::MyKernel, ...)`: you can redefine the diverse `kernelmatrix` functions to eventually optimize the computations.
-- `Base.print(io::IO, k::MyKernel)`: if you want to specialize the printing of your kernel
+- `Base.print(io::IO, k::MyKernel)`: if you want to specialize the printing of your kernel.
 
 KernelFunctions uses [Functors.jl](https://github.com/FluxML/Functors.jl) for specifying trainable kernel parameters
 in a way that is compatible with the [Flux ML framework](https://github.com/FluxML/Flux.jl).
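For illustration, a hedged sketch of what marking a kernel's parameters as trainable can look like (the kernel type and its field are hypothetical, not part of the package):

```julia
using KernelFunctions, Functors

# Hypothetical kernel with a trainable variance, stored as a one-element
# vector so it can be updated in place during training.
struct MyVarianceKernel{Tk<:Kernel,T} <: KernelFunctions.Kernel
    kernel::Tk
    σ²::Vector{T}
end

(k::MyVarianceKernel)(x, y) = first(k.σ²) * k.kernel(x, y)

# Mark all fields as children so Functors.jl/Flux can collect the parameters.
Functors.@functor MyVarianceKernel
```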

docs/src/index.md

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 # KernelFunctions.jl
 
-Model agnostic kernel functions compatible with automatic differentiation
+Model-agnostic kernel functions compatible with automatic differentiation
 
 **KernelFunctions.jl** is a general-purpose kernel package.
 It aims at providing a flexible framework for creating kernels and manipulating them.

docs/src/kernels.md

Lines changed: 28 additions & 22 deletions
@@ -4,7 +4,7 @@
 
 # Base Kernels
 
-These are the basic kernels without any transformation of the data. They are the building blocks of KernelFunctions
+These are the basic kernels without any transformation of the data. They are the building blocks of KernelFunctions.
 
 
 ## Constant Kernels
@@ -86,7 +86,7 @@ The [`FBMKernel`](@ref) is defined as
 k(x,x';h) = \frac{|x|^{2h} + |x'|^{2h} - |x-x'|^{2h}}{2},
 ```
 
-where $h$ is the [Hurst index](https://en.wikipedia.org/wiki/Hurst_exponent#Generalized_exponent) and $0<h<1$.
+where $h$ is the [Hurst index](https://en.wikipedia.org/wiki/Hurst_exponent#Generalized_exponent) and $0 < h < 1$.
 
 ## Gabor Kernel
 
@@ -96,11 +96,11 @@ The [`GaborKernel`](@ref) is defined as
 k(x,x'; l,p) =& h(x-x';l,p)\\
 h(u;l,p) =& \exp\left(-\cos\left(\pi \sum_i \frac{u_i}{p_i}\right)\sum_i \frac{u_i^2}{l_i^2}\right),
 ```
-where $l_i >0 $ is the lengthscale and $p_i>0$ is the period.
+where $l_i > 0$ is the lengthscale and $p_i > 0$ is the period.
 
-## Matern Kernels
+## Matérn Kernels
 
-### Matern Kernel
+### General Matérn Kernel
 
 The [`MaternKernel`](@ref) is defined as
 
@@ -110,15 +110,23 @@ The [`MaternKernel`](@ref) is defined as
 
 where $\nu > 0$.
 
-### Matern 3/2 Kernel
+### Matérn 1/2 Kernel
+
+The Matérn 1/2 kernel is defined as
+```math
+k(x,x') = \exp\left(-|x-x'|\right).
+```
+It is equivalent to the [`ExponentialKernel`](@ref).
+
+### Matérn 3/2 Kernel
 
 The [`Matern32Kernel`](@ref) is defined as
 
 ```math
 k(x,x') = \left(1+\sqrt{3}|x-x'|\right)\exp\left(-\sqrt{3}|x-x'|\right).
 ```
 
-### Matern 5/2 Kernel
+### Matérn 5/2 Kernel
 
 The [`Matern52Kernel`](@ref) is defined as
 
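A small check grounded in the equivalence just stated (a sketch; `ν` is assumed to be the keyword argument of the `MaternKernel` constructor):

```julia
using KernelFunctions

# Matérn with ν = 1/2 should agree with the exponential kernel, as noted
# above (up to floating-point error in the Bessel-function evaluation).
k_half = MaternKernel(ν=1/2)
k_exp = ExponentialKernel()

x, y = rand(3), rand(3)
k_half(x, y) ≈ k_exp(x, y)
```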
@@ -128,7 +136,7 @@ The [`Matern52Kernel`](@ref) is defined as
 
 ## Neural Network Kernel
 
-The [`NeuralNetworkKernel`](@ref) (as in the kernel for an infinitely wide neural network interpretated as a Gaussian process) is defined as
+The [`NeuralNetworkKernel`](@ref) (as in the kernel for an infinitely wide neural network interpreted as a Gaussian process) is defined as
 
 ```math
 k(x, x') = \arcsin\left(\frac{\langle x, x'\rangle}{\sqrt{(1+\langle x, x\rangle)(1+\langle x',x'\rangle)}}\right).
@@ -142,7 +150,7 @@ The [`PeriodicKernel`](@ref) is defined as
 k(x,x';r) = \exp\left(-0.5 \sum_i (\sin(\pi(x_i - x'_i))/r_i)^2\right),
 ```
 
-where $r$ has the same dimension as $x$ and $r_i >0$.
+where $r$ has the same dimension as $x$ and $r_i > 0$.
 
 ## Piecewise Polynomial Kernel
 
@@ -153,7 +161,7 @@ The [`PiecewisePolynomialKernel`](@ref) is defined as
 r =& x^\top P x',\\
 j =& \lfloor \frac{D}{2}\rfloor + V + 1,
 ```
-where $x\in \mathbb{R}^D$, $V \in \{0,1,2,3\} and $P$ is a positive definite matrix.
+where $x\in \mathbb{R}^D$, $V \in \{0,1,2,3\}$, and $P$ is a positive-definite matrix.
 $f$ is a piecewise polynomial (see source code).
 
 ## Polynomial Kernels
@@ -166,7 +174,7 @@ The [`LinearKernel`](@ref) is defined as
 k(x,x';c) = \langle x,x'\rangle + c,
 ```
 
-where $c \in \mathbb{R}$
+where $c \in \mathbb{R}$.
 
 ### Polynomial Kernel
 
@@ -176,7 +184,7 @@ The [`PolynomialKernel`](@ref) is defined as
 k(x,x';c,d) = \left(\langle x,x'\rangle + c\right)^d,
 ```
 
-where $c \in \mathbb{R}$ and $d>0$
+where $c \in \mathbb{R}$ and $d > 0$.
 
 
 ## Rational Quadratic
@@ -223,43 +231,41 @@ where $i\in\{-1,0,1,2,3\}$ and coefficients $a_i$, $b_i$ are fixed and residuals
 
 ### Transformed Kernel
 
-The [`TransformedKernel`](@ref) is a kernel where input are transformed via a function `f`
+The [`TransformedKernel`](@ref) is a kernel where inputs are transformed via a function `f`:
 
 ```math
 k(x,x';f,\widetilde{k}) = \widetilde{k}(f(x),f(x')),
 ```
-
-Where $\widetilde{k}$ is another kernel and $f$ is an arbitrary mapping.
+where $\widetilde{k}$ is another kernel and $f$ is an arbitrary mapping.
 
 ### Scaled Kernel
 
 The [`ScaledKernel`](@ref) is defined as
 
 ```math
-k(x,x';\sigma^2,\widetilde{k}) = \sigma^2\widetilde{k}(x,x')
+k(x,x';\sigma^2,\widetilde{k}) = \sigma^2\widetilde{k}(x,x'),
 ```
-
-Where $\widetilde{k}$ is another kernel and $\sigma^2 > 0$.
+where $\widetilde{k}$ is another kernel and $\sigma^2 > 0$.
 
 ### Kernel Sum
 
-The [`KernelSum`](@ref) is defined as a sum of kernels
+The [`KernelSum`](@ref) is defined as a sum of kernels:
 
 ```math
 k(x, x'; \{k_i\}) = \sum_i k_i(x, x').
 ```
 
-### KernelProduct
+### Kernel Product
 
-The [`KernelProduct`](@ref) is defined as a product of kernels
+The [`KernelProduct`](@ref) is defined as a product of kernels:
 
 ```math
 k(x,x';\{k_i\}) = \prod_i k_i(x,x').
 ```
 
 ### Tensor Product
 
-The [`TensorProduct`](@ref) is defined as :
+The [`TensorProduct`](@ref) is defined as:
 
 ```math
 k(x,x';\{k_i\}) = \prod_i k_i(x_i,x'_i).
 ```
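As a usage illustration for these combinations (a sketch; it assumes the arithmetic overloads `+` and `*` build `KernelSum` and `KernelProduct`, otherwise the constructors can be called directly):

```julia
using KernelFunctions

k1 = SqExponentialKernel()
k2 = Matern32Kernel()

ksum = k1 + k2    # sum of kernels: k(x, x') = k1(x, x') + k2(x, x')
kprod = k1 * k2   # product of kernels: k(x, x') = k1(x, x') * k2(x, x')

x, y = rand(3), rand(3)
ksum(x, y), kprod(x, y)
```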

docs/src/metrics.md

Lines changed: 5 additions & 4 deletions
@@ -1,16 +1,17 @@
 # Metrics
 
 KernelFunctions.jl relies on [Distances.jl](https://github.com/JuliaStats/Distances.jl) for computing the pairwise matrix.
-To do so a distance measure is needed for each kernel. Two very common ones can already be used : `SqEuclidean` and `Euclidean`.
-However all kernels do not rely on distances metrics respecting all the definitions. That's why additional metrics come with the package such as `DotProduct` (`<x,y>`) and `Delta` (`δ(x,y)`).
-Note that every `SimpleKernel` must have a defined metric defined as :
+To do so a distance measure is needed for each kernel. Two very common ones can already be used: `SqEuclidean` and `Euclidean`.
+However, not all kernels rely on distance metrics respecting all the definitions as in Distances.jl. For this reason, KernelFunctions.jl provides additional "metrics" such as `DotProduct` ($\langle x, y \rangle$) and `Delta` ($\delta(x,y)$).
+
+Note that every `SimpleKernel` must have a metric, specified as
 ```julia
 KernelFunctions.metric(::CustomKernel) = SqEuclidean()
 ```
 
 ## Adding a new metric
 
-If you want to create a new distance just implement the following :
+If you want to create a new "metric" just implement the following:
 
 ```julia
 struct Delta <: Distances.PreMetric
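 # (The hunk is truncated here; what follows is a hedged, illustrative sketch
 # of how such a "metric" might be completed, assuming Distances.jl's callable
 # evaluation interface; it is not necessarily the package's actual code.)
 end
 
 (dist::Delta)(a, b) = a == b ? 1 : 0
 Distances.result_type(::Delta, ::Type, ::Type) = Int
 ```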

docs/src/transform.md

Lines changed: 4 additions & 4 deletions
@@ -1,9 +1,9 @@
-# Transform
+# Input Transforms
 
 `Transform` is the object that takes care of transforming the input data before distances are being computed. It can be as standard as `IdentityTransform` returning the same input, or multiplying the data by a scalar with `ScaleTransform` or by a vector with `ARDTransform`.
-There is a more general `Transform`: `FunctionTransform` that uses a function and apply it on each vector via `mapslices`.
-You can also create a pipeline of `Transform` via `TransformChain`. For example `LowRankTransform(rand(10,5))∘ScaleTransform(2.0)`.
+There is a more general `Transform`: `FunctionTransform` that uses a function and applies it on each vector via `mapslices`.
+You can also create a pipeline of `Transform` via `TransformChain`. For example, `LowRankTransform(rand(10,5))∘ScaleTransform(2.0)`.
 
-One apply a transformation on a matrix or a vector via `KernelFunctions.apply(t::Transform,v::AbstractVecOrMat)`
+A transformation can be applied to a matrix or a vector via `KernelFunctions.apply(t::Transform, v::AbstractVecOrMat)`.
 
 Check the list on the [API page](@ref Transforms).
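For illustration, a brief sketch of applying a transform (the call follows the `apply` signature quoted above; the values are arbitrary):

```julia
using KernelFunctions

# Scale each input by 2.0 and apply the transform to a vector.
t = ScaleTransform(2.0)
v = rand(3)
KernelFunctions.apply(t, v)  # elementwise scaling, equivalent to 2.0 .* v
```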
