
Commit 65d7088

willtebbutt and theogf authored
Tweak some docs a bit (#311)
* Tweak some docs a bit
* Update docs/src/index.md
  Co-authored-by: Théo Galy-Fajou <[email protected]>
* Update docs/src/userguide.md
  Co-authored-by: Théo Galy-Fajou <[email protected]>
* Update docs/src/userguide.md
  Co-authored-by: Théo Galy-Fajou <[email protected]>
1 parent efca34f commit 65d7088

5 files changed: +37 -20 lines changed


docs/make.jl

Lines changed: 0 additions & 1 deletion
@@ -29,7 +29,6 @@ makedocs(;
         "kernels.md",
         "transform.md",
         "metrics.md",
-        "theory.md",
         "create_kernel.md",
         "API" => "api.md",
         "Examples" => "example.md",

docs/src/index.md

Lines changed: 5 additions & 10 deletions
@@ -1,17 +1,12 @@
 # KernelFunctions.jl
 
-Model-agnostic kernel functions compatible with automatic differentiation
-
-**KernelFunctions.jl** is a general purpose kernel package.
+**KernelFunctions.jl** is a general purpose [kernel](https://en.wikipedia.org/wiki/Positive-definite_kernel) package.
 It aims at providing a flexible framework for creating kernels and manipulating them.
-The main goals of this package compared to its predecessors/concurrents in [MLKernels.jl](https://github.com/trthatcher/MLKernels.jl), [Stheno.jl](https://github.com/willtebbutt/Stheno.jl), [GaussianProcesses.jl](https://github.com/STOR-i/GaussianProcesses.jl) and [AugmentedGaussianProcesses.jl](https://github.com/theogf/AugmentedGaussianProcesses.jl) are:
-- **Automatic Differentation** compatibility: all kernel functions should be differentiable via packages like [ForwardDiff.jl](https://github.com/JuliaDiff/ForwardDiff.jl) or [Zygote.jl](https://github.com/FluxML/Zygote.jl)
+The main goals of this package are:
 - **Flexibility**: operations between kernels should be fluid and easy without breaking.
 - **Plug-and-play**: including the kernels before/after other steps should be straightforward.
+- **Automatic Differentiation** compatibility: all kernel functions which _ought_ to be differentiable using AD packages like [ForwardDiff.jl](https://github.com/JuliaDiff/ForwardDiff.jl) or [Zygote.jl](https://github.com/FluxML/Zygote.jl) _should_ be.
 
-The methodology of how kernels are computed is quite simple and is done in three phases :
-- A `Transform` object is applied sample-wise on every sample
-- The pairwise matrix is computed using [Distances.jl](https://github.com/JuliaStats/Distances.jl) by using a `Metric` proper to each kernel
-- The `Kernel` function is applied element-wise on the pairwise matrix
+This package builds on lots of excellent existing work in packages such as [MLKernels.jl](https://github.com/trthatcher/MLKernels.jl), [Stheno.jl](https://github.com/willtebbutt/Stheno.jl), [GaussianProcesses.jl](https://github.com/STOR-i/GaussianProcesses.jl), and [AugmentedGaussianProcesses.jl](https://github.com/theogf/AugmentedGaussianProcesses.jl).
 
-For a quick introduction on how to use it go to [User guide](@ref)
+See the [User guide](@ref) for a brief introduction.

docs/src/theory.md

Lines changed: 0 additions & 3 deletions
This file was deleted.

docs/src/userguide.md

Lines changed: 22 additions & 3 deletions
@@ -73,18 +73,37 @@ kernelmatrix(k, ColVecs(X)) # returns a 5×5 matrix -- each column of X treated
 ```
 This is the mechanism used throughout KernelFunctions.jl to handle multi-dimensional inputs.
 
-You can also utilise the `obsdim` keyword argument if you prefer:
+You can utilise the `obsdim` keyword argument if you prefer:
 ```julia
 kernelmatrix(k, X; obsdim=1) # same as RowVecs(X)
 kernelmatrix(k, X; obsdim=2) # same as ColVecs(X)
 ```
 This is similar to the convention used in [Distances.jl](https://github.com/JuliaStats/Distances.jl).
 
-See [Input Types](@ref) for a more thorough discussion of these two approaches.
+### So what type should I use to represent a collection of inputs?
+The central assumption made by KernelFunctions.jl is that all collections of `N` inputs are represented by `AbstractVector`s of length `N`.
+Abstraction is then used to ensure that efficiency is retained, `ColVecs` and `RowVecs`
+being the most obvious examples of this.
 
+Concretely:
+1. For `Real`-valued inputs (scalars), a `Vector{<:Real}` is fine.
+1. For vector-valued inputs, consider a `ColVecs` or `RowVecs`.
+1. For a new input type, simply represent collections of inputs of this type as an `AbstractVector`.
 
+See [Input Types](@ref) and [Design](@ref) for a more thorough discussion of the
+considerations made when this design was adopted.
 
-We also support specific kernel matrix outputs:
+The `obsdim` kwarg mentioned above is a special case for vector-valued inputs stored in a
+matrix.
+It is implemented as a lightweight wrapper that constructs either a `RowVecs` or `ColVecs`
+from your inputs, and passes this on.
+
+
+
+### Output Types
+
+In addition to plain `Matrix`-like output, KernelFunctions.jl supports specific output
+types:
 - For a positive-definite matrix object of type `PDMat` from [`PDMats.jl`](https://github.com/JuliaStats/PDMats.jl), you can call the following:
 ```julia
 using PDMats
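
To make the input-type discussion above concrete, here is a minimal sketch (not part of the diff) showing that `ColVecs`/`RowVecs` and the `obsdim` keyword describe the same collections of inputs; the kernel choice and matrix size are arbitrary, for illustration only:

```julia
using KernelFunctions

k = SqExponentialKernel()
X = rand(10, 5)  # a 10×5 matrix of numbers

# Treat each of the 5 columns as an input in R^10.
K_cols = kernelmatrix(k, ColVecs(X))   # 5×5
K_cols ≈ kernelmatrix(k, X; obsdim=2)  # true

# Treat each of the 10 rows as an input in R^5.
K_rows = kernelmatrix(k, RowVecs(X))   # 10×10
K_rows ≈ kernelmatrix(k, X; obsdim=1)  # true
```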

src/matrix/kernelmatrix.jl

Lines changed: 10 additions & 3 deletions
@@ -24,9 +24,16 @@ kernelmatrix!
 
 """
     kernelmatrix(κ::Kernel, x::AbstractVector)
+
+Compute the kernel `κ` for each pair of inputs in `x`.
+Returns a matrix of size `(length(x), length(x))` satisfying
+`kernelmatrix(κ, x)[p, q] == κ(x[p], x[q])`.
+
     kernelmatrix(κ::Kernel, x::AbstractVector, y::AbstractVector)
 
-Calculate the kernel matrix of `x` (and `y`) with respect to kernel `κ`.
+Compute the kernel `κ` for each pair of inputs in `x` and `y`.
+Returns a matrix of size `(length(x), length(y))` satisfying
+`kernelmatrix(κ, x, y)[p, q] == κ(x[p], y[q])`.
 
     kernelmatrix(κ::Kernel, X::AbstractMatrix; obsdim::Int=2)
     kernelmatrix(κ::Kernel, X::AbstractMatrix, Y::AbstractMatrix; obsdim::Int=2)
@@ -65,11 +72,11 @@ kernelmatrix_diag!
 """
     kernelmatrix_diag(κ::Kernel, x::AbstractVector)
 
-Calculate the diagonal matrix of `x` with respect to kernel `κ`.
+Compute the diagonal of `kernelmatrix(κ, x)` efficiently.
 
     kernelmatrix_diag(κ::Kernel, x::AbstractVector, y::AbstractVector)
 
-Calculate the diagonal of `kernelmatrix(κ, x, y)` efficiently.
+Compute the diagonal of `kernelmatrix(κ, x, y)` efficiently.
 Requires that `x` and `y` are the same length.
 
     kernelmatrix_diag(κ::Kernel, X::AbstractMatrix; obsdim::Int=2)
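
The updated docstring contracts above can be checked with a short sketch like the following (again not part of the commit; the kernel and input sizes are arbitrary):

```julia
using KernelFunctions, LinearAlgebra

κ = SqExponentialKernel()
x = ColVecs(rand(3, 4))  # 4 inputs, each in R^3
y = ColVecs(rand(3, 6))  # 6 inputs, each in R^3

K = kernelmatrix(κ, x, y)   # size (4, 6)
K[2, 5] ≈ κ(x[2], y[5])     # true, as stated in the docstring

# kernelmatrix_diag computes only the diagonal, without building the full matrix.
kernelmatrix_diag(κ, x) ≈ diag(kernelmatrix(κ, x))  # true
```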

0 commit comments
