Commit 5d40a23

Merge branch 'master-dev'
2 parents: 3f3a800 + 8af9ac5

33 files changed: +420 −190 lines

Project.toml

Lines changed: 3 additions & 2 deletions
@@ -10,10 +10,11 @@ SpecialFunctions = "276daf66-3868-5448-9aa4-cd146d93841b"
 StatsFuns = "4c63d2b9-4356-54db-8cca-17b64c39e42c"
 
 [compat]
-julia = "1.0"
+Distances = "0.8.2"
 PDMats = "0.9.9"
 SpecialFunctions = "0.7.2"
-Distances = "0.8.2"
+StatsFuns = "0.8"
+julia = "1.0"
 
 [extras]
 Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"

README.md

Lines changed: 7 additions & 3 deletions
@@ -34,9 +34,13 @@ The aim is to make the API as model-agnostic as possible while still being user-
 <img src="docs/src/assets/heatmap_combination.png" width=400px>
 </p>
 
-## Objectives (by priority)
-- AD Compatibility (Zygote, ForwardDiff)
-- Toeplitz Matrices
+## Package goals (by priority)
+- Ensure AD compatibility (Zygote, ForwardDiff)
+- Toeplitz matrices compatibility
 - BLAS backend
 
 Directly inspired by the [MLKernels](https://github.com/trthatcher/MLKernels.jl) package.
+
+## Issues/Contributing
+
+If you notice a problem or would like to contribute by adding more kernel functions or features, please [submit an issue](https://github.com/theogf/KernelFunctions.jl/issues).

docs/make.jl

Lines changed: 2 additions & 1 deletion
@@ -6,7 +6,8 @@ makedocs(
     format = Documenter.HTML(),
     modules = [KernelFunctions],
     pages = ["Home"=>"index.md",
-        "User Guide" => "userguide.md",
+        "User Guide" => "userguide.md",
+        "Examples"=>"example.md",
         "Kernel Functions"=>"kernels.md",
         "Transform"=>"transform.md",
         "Metrics"=>"metrics.md",

docs/src/api.md

Lines changed: 6 additions & 1 deletion
@@ -18,7 +18,7 @@ KernelFunctions
 
 ```@docs
 SqExponentialKernel
-Exponential
+ExponentialKernel
 GammaExponentialKernel
 ExponentiatedKernel
 MaternKernel
@@ -28,6 +28,9 @@ LinearKernel
 PolynomialKernel
 RationalQuadraticKernel
 GammaRationalQuadraticKernel
+ZeroKernel
+ConstantKernel
+WhiteKernel
 ```
 
 ## Kernel Combinations
@@ -45,6 +48,7 @@ IdentityTransform
 ScaleTransform
 LowRankTransform
 FunctionTransform
+ChainTransform
 ```
 
 ## Functions
@@ -54,6 +58,7 @@ kernelmatrix
 kernelmatrix!
 kerneldiagmatrix
 kerneldiagmatrix!
+kernelpdmat
 transform
 ```
docs/src/example.md

Lines changed: 21 additions & 0 deletions
@@ -0,0 +1,21 @@
+# Examples (WIP)
+
+Here are a few examples of well-known, more complex kernels and how to build them, as well as how to use kernels in specific contexts.
+
+## Kernel Ridge Regression
+
+Make a simple example of kernel ridge regression.
+
+## Gaussian Process Regression
+
+Make a simple example of Gaussian process regression.
+
+## Deep Kernel Learning
+
+Put a Flux neural net in front of the kernel
+(cf. the Wilson et al. paper).
+
+## Kernel Selection
+
+Create a large collection of kernels and optimize the weights
+(cf. the AISTATS 2018 paper).
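
The kernel ridge regression stub above could look like the following minimal sketch. The toy data and the ridge parameter `λ` are illustrative assumptions, and the two-matrix call `kernelmatrix(k, Xtest, X, obsdim = 1)` is assumed to follow the same `obsdim` convention as the user guide; the Gaussian process regression posterior mean has the same form with `λ` replaced by the observation noise variance.

```julia
using KernelFunctions, LinearAlgebra

# Toy 1D regression data (50 samples, 1 feature, obsdim = 1 layout)
X = rand(50, 1)
y = sin.(3 .* vec(X)) .+ 0.1 .* randn(50)

k = SqExponentialKernel()
λ = 1e-3                                        # ridge regularization (assumed value)

K = kernelmatrix(k, X, obsdim = 1)              # 50×50 training kernel matrix
α = (K + λ * I) \ y                             # kernel ridge regression weights

Xtest = reshape(collect(0:0.01:1), :, 1)        # 101 test points on a grid
Kstar = kernelmatrix(k, Xtest, X, obsdim = 1)   # 101×50 cross-kernel matrix
ŷ = Kstar * α                                   # predictions at the test points
```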

docs/src/index.md

Lines changed: 13 additions & 1 deletion
@@ -2,4 +2,16 @@
 
 Model agnostic kernel functions compatible with automatic differentiation
 
-*** In Construction ***
+**KernelFunctions.jl** is a general purpose kernel package.
+It aims to provide a flexible framework for creating and manipulating kernels.
+The main goals of this package, compared to its predecessors/competitors [MLKernels.jl](https://github.com/trthatcher/MLKernels.jl), [Stheno.jl](https://github.com/willtebbutt/Stheno.jl), [GaussianProcesses.jl](https://github.com/STOR-i/GaussianProcesses.jl) and [AugmentedGaussianProcesses.jl](https://github.com/theogf/AugmentedGaussianProcesses.jl), are:
+- **Automatic Differentiation** compatibility: all kernel functions should be differentiable via packages like [ForwardDiff.jl](https://github.com/JuliaDiff/ForwardDiff.jl) or [Zygote.jl](https://github.com/FluxML/Zygote.jl)
+- **Flexibility**: operations between kernels should be fluid and easy, without breaking.
+- **Plug-and-play**: including the kernels before/after other steps should be straightforward.
+
+Kernels are computed in three simple phases:
+- A `Transform` object is applied sample-wise to every input
+- The pairwise matrix is computed with [Distances.jl](https://github.com/JuliaStats/Distances.jl), using the `Metric` associated with each kernel
+- The `Kernel` function is applied element-wise on the pairwise matrix
+
+For a quick introduction on how to use it, see the [User guide](@ref).
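
The three phases above can be sketched by hand. The following is a minimal illustration for a squared exponential kernel with a scalar scaling; the lengthscale value and the manual steps are assumptions for illustration only, since the package wraps all of this in `kernelmatrix`.

```julia
using Distances

X = rand(10, 3)      # 10 samples with 3 features (obsdim = 1 layout)
ℓ = 2.0              # illustrative lengthscale

Xt = X ./ ℓ                                   # phase 1: sample-wise Transform (here a simple scaling)
D2 = pairwise(SqEuclidean(), Xt, dims = 1)    # phase 2: pairwise matrix via Distances.jl
K  = exp.(-D2)                                # phase 3: kernel function applied element-wise
```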

docs/src/kernels.md

Lines changed: 75 additions & 16 deletions
@@ -1,34 +1,93 @@
 ```@meta
-CurrentModule = KernelFunctions
+  CurrentModule = KernelFunctions
 ```
 
 ## Exponential Kernels
 
-```@docs
-ExponentialKernel
-SqExponentialKernel
-GammaExponentialKernel
+### Exponential Kernel
+
+The [Exponential Kernel](@ref ExponentialKernel) is defined as
+```math
+k(x,x') = \exp\left(-|x-x'|\right)
+```
+
+### Squared Exponential Kernel
+
+The [Squared Exponential Kernel](@ref KernelFunctions.SqExponentialKernel) is defined as
+```math
+k(x,x') = \exp\left(-\|x-x'\|^2\right)
+```
+
+### Gamma Exponential Kernel
+
+```math
+k(x,x';\gamma) = \exp\left(-\|x-x'\|^{2\gamma}\right)
 ```
 
 ## Matern Kernels
 
-```@docs
-MaternKernel
-Matern32Kernel
-Matern52Kernel
+### Matern Kernel
+
+```math
+k(x,x';\nu) = \frac{2^{1-\nu}}{\Gamma(\nu)}\left(\sqrt{2\nu}|x-x'|\right)^\nu K_\nu\left(\sqrt{2\nu}|x-x'|\right)
+```
+
+### Matern 3/2 Kernel
+
+```math
+k(x,x') = \left(1+\sqrt{3}|x-x'|\right)\exp\left(-\sqrt{3}|x-x'|\right)
+```
+
+### Matern 5/2 Kernel
+
+```math
+k(x,x') = \left(1+\sqrt{5}|x-x'|+\frac{5}{3}|x-x'|^2\right)\exp\left(-\sqrt{5}|x-x'|\right)
+```
+
+## Rational Quadratic
+
+### Rational Quadratic Kernel
+
+```math
+k(x,x';\alpha) = \left(1+\frac{\|x-x'\|^2}{\alpha}\right)^{-\alpha}
+```
+
+### Gamma Rational Quadratic Kernel
+
+```math
+k(x,x';\alpha,\gamma) = \left(1+\frac{\|x-x'\|^{2\gamma}}{\alpha}\right)^{-\alpha}
 ```
 
 ## Polynomial Kernels
 
-```@docs
-LinearKernel
-PolynomialKernel
+### LinearKernel
+
+```math
+k(x,x';c) = \langle x,x'\rangle + c
+```
+
+### PolynomialKernel
+
+```math
+k(x,x';c,d) = \left(\langle x,x'\rangle + c\right)^d
 ```
 
 ## Constant Kernels
 
-```@docs
-ConstantKernel
-WhiteKernel
-ZeroKernel
+### ConstantKernel
+
+```math
+k(x,x';c) = c
+```
+
+### WhiteKernel
+
+```math
+k(x,x') = \delta(x-x')
+```
+
+### ZeroKernel
+
+```math
+k(x,x') = 0
 ```
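
To make the general Matern formula above concrete, here is a direct, illustrative transcription using SpecialFunctions.jl (one of the package dependencies). This is a hedged sketch of the formula itself as a function of the distance, not the package's implementation.

```julia
using SpecialFunctions   # provides gamma and besselk

# Direct transcription of the Matern formula above, with r = |x - x'|
function matern(r::Real, ν::Real)
    r == 0 && return 1.0                           # limiting value at zero distance
    t = sqrt(2ν) * r
    return 2^(1 - ν) / gamma(ν) * t^ν * besselk(ν, t)
end

matern(0.5, 3 / 2)   # should agree with the Matern 3/2 formula evaluated at r = 0.5
```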

docs/src/theory.md

Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
+See [Wikipedia article](https://en.wikipedia.org/wiki/Positive-definite_kernel)
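
For reference, the property that article describes (the standard definition, stated here for completeness rather than taken from the package docs): a symmetric function k(x, x') on a set X is positive definite if

```math
\sum_{i=1}^n \sum_{j=1}^n c_i\, c_j\, k(x_i, x_j) \geq 0
\quad \text{for all } n \in \mathbb{N},\; x_1,\dots,x_n \in \mathcal{X},\; c_1,\dots,c_n \in \mathbb{R}.
```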

docs/src/transform.md

Lines changed: 3 additions & 0 deletions
@@ -7,6 +7,9 @@ You can also create a pipeline of `Transform` via `TransformChain`. For example
 One applies a transformation to a matrix or a vector via `transform(t::Transform,v::AbstractVecOrMat)`
 
 ## Transforms
+```@meta
+CurrentModule = KernelFunctions
+```
 
 ```@docs
 IdentityTransform
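
A minimal sketch of the `transform` call whose signature is quoted above (the `ScaleTransform` constructor argument is an illustrative assumption):

```julia
using KernelFunctions

t = ScaleTransform(2.0)   # a simple Transform listed in the API section
v = rand(5)
transform(t, v)           # applies the transform to the vector, as described above
```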

docs/src/userguide.md

Lines changed: 28 additions & 1 deletion
@@ -1 +1,28 @@
-# Building kernel and matrices easily!
+# User guide
+
+## Kernel creation
+
+To create a kernel, choose one of the kernels provided (see [Kernels](@ref)) or create your own (see [Creating Kernels](@ref)).
+For example, to create a squared exponential kernel:
+```julia
+k = SqExponentialKernel()
+```
+All kernels can take a `Transform` object as argument (see [Transform](@ref)), which acts directly on the inputs before they are processed.
+It is also possible to simply pass a scalar or a vector if all you want to do is modify the lengthscale, for all dimensions at once or independently for each dimension, respectively.
+
+## Kernel matrix creation
+
+Matrices are created via the `kernelmatrix` or `kerneldiagmatrix` functions.
+An important argument is `obsdim`, the dimensionality of the input: it specifies whether the matrix is laid out as `# samples X # features` (`obsdim=1`) or `# features X # samples` (`obsdim=2`), similarly to [Distances.jl](https://github.com/JuliaStats/Distances.jl).
+For example:
+```julia
+k = SqExponentialKernel()
+A = rand(10,5)
+kernelmatrix(k,A,obsdim=1) # Returns a 10x10 matrix
+kernelmatrix(k,A,obsdim=2) # Returns a 5x5 matrix
+```
+
+## Kernel manipulation
+
+One can create combinations of kernels via `KernelSum` and `KernelProduct`, or using the simple operators `+` and `*`.
+For example:
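
A minimal sketch of what such a combination could look like, also illustrating the scalar lengthscale shorthand mentioned under Kernel creation. Treating `+` and `*` as producing `KernelSum` and `KernelProduct`, and the scalar constructor shorthand, are assumptions based on the descriptions above.

```julia
using KernelFunctions

k1 = SqExponentialKernel()
k2 = SqExponentialKernel(2.0)       # scalar shorthand: same scaling on every dimension (assumed)

ksum  = k1 + k2                     # kernel sum, as described above
kprod = k1 * k2                     # kernel product

A = rand(10, 5)
kernelmatrix(ksum, A, obsdim = 1)   # 10×10 matrix for the combined kernel
```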
