
Commit faec4ea

Docs improvements
1 parent 018913a commit faec4ea


1 file changed: docs/src/create_kernel.md (9 additions, 9 deletions)
@@ -8,7 +8,7 @@ Here are a few ways depending on how complicated your kernel is :
 
 If your kernel function is of the form `k(x, y) = f(d(x, y))` where `d(x, y)` is a `PreMetric`,
 you can construct your custom kernel by defining `kappa` and `metric` for your kernel.
-Here is for example how one can define the `SqExponentialKernel` again:
+Here is for example how one can define the `SqExponentialKernel` again :
 
 ```julia
 struct MyKernel <: KernelFunctions.SimpleKernel end
@@ -28,15 +28,15 @@ struct MyKernel <: KernelFunctions.Kernel end
 (::MyKernel)(x, y) = asin(dot(x, y) / sqrt((1 + sum(abs2, x)) * (1 + sum(abs2, y))))
 ```
 
-Note that `Kernel` do not use `Distances.jl` and can therefore be a bit slower.
+Note that `BaseKernel` do not use `Distances.jl` and can therefore be a bit slower.
 
 ### Additional Options
 
 Finally there are additional functions you can define to bring in more features:
-- Define the trainable parameters of your kernel with `KernelFunctions.trainable(k)` which should return a `Tuple` of your parameters.
-This parameters will be then passed to `Flux.params` function
-- `KernelFunctions.iskroncompatible(k)`, if your kernel factorizes in the dimensions. You can declare your kernel as `iskroncompatible(k) = true`
-- `KernelFunctions.dim`: by default the dimension of the inputs will only be checked for vectors of `AbstractVector{<:Real}`.
-If you want to check the dimensions of your inputs, dispatch the `dim` function on your kernel. Note that `0` is the default.
-- You can also redefine the `kernelmatrix(k, x, y)` functions for your kernel to eventually optimize the computations of your kernel.
-- `Base.show(io::IO, k::Kernel)`, if you want to specialize the printing of your kernel
+- `KernelFunctions.trainable(k::MyKernel)`: it defines the trainable parameters of your kernel, it should return a `Tuple` of your parameters.
+These parameters will be passed to the `Flux.params` function. For some examples see the `trainable.jl` file in `src/`
+- `KernelFunctions.iskroncompatible(k::MyKernel)`: if your kernel factorizes in dimensions, you can declare your kernel as `iskroncompatible(k) = true` to use Kronecker methods.
+- `KernelFunctions.dim(x::MyDataType)`: by default the dimension of the inputs will only be checked for vectors of type `AbstractVector{<:Real}`. If you want to check the dimensionality of your inputs, dispatch the `dim` function on your datatype. Note that `0` is the default.
+- You can also directly overload `KernelFunctions.validate_inputs(x::MyDataType, y::MyDataType)` if you want to run special checks for your input types.
+- `kernelmatrix(k::MyKernel, ...)`: you can redefine the diverse `kernelmatrix` functions to eventually optimize the computations.
+- `Base.print(io::IO, k::MyKernel)`: if you want to specialize the printing of your kernel
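For reference, here is a minimal runnable sketch of the `kappa`/`metric` recipe that the truncated code block in the first hunk begins (not part of the commit). It assumes the squared Euclidean distance as the metric and `exp(-d²)` as `kappa`, which may differ slightly from the library's own `SqExponentialKernel` definition.

```julia
using KernelFunctions
using Distances

# A SimpleKernel is defined by a metric d(x, y) and a scalar map kappa.
struct MyKernel <: KernelFunctions.SimpleKernel end

# kappa applies f to the metric value: k(x, y) = f(d(x, y)).
# The exact constant in the exponent is an assumption here.
KernelFunctions.kappa(::MyKernel, d2::Real) = exp(-d2)

# metric declares which Distances.jl (pre)metric feeds kappa.
KernelFunctions.metric(::MyKernel) = SqEuclidean()

k = MyKernel()
k([1.0, 2.0], [2.0, 4.0])  # exp(-5.0), since the squared Euclidean distance is 5
```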
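The "Additional Options" list can be illustrated with a hypothetical parameterized kernel; this is only a sketch, not taken from the repository. `MyScaledKernel` and its lengthscale field are made up, `trainable` and `iskroncompatible` are the hooks named in the list, and `Base.show` is used for printing (the standard Julia hook; the updated docs mention `Base.print`).

```julia
using KernelFunctions
using Distances

# Hypothetical kernel with one trainable lengthscale, for illustration only.
struct MyScaledKernel{T} <: KernelFunctions.SimpleKernel
    ℓ::Vector{T}  # stored as a length-1 vector so it can be updated in place
end

KernelFunctions.kappa(k::MyScaledKernel, d2::Real) = exp(-d2 / first(k.ℓ)^2)
KernelFunctions.metric(::MyScaledKernel) = SqEuclidean()

# Trainable parameters, returned as a Tuple (picked up through Flux.params).
KernelFunctions.trainable(k::MyScaledKernel) = (k.ℓ,)

# The kernel factorizes across input dimensions, so Kronecker methods apply.
KernelFunctions.iskroncompatible(::MyScaledKernel) = true

# Specialized printing.
Base.show(io::IO, k::MyScaledKernel) = print(io, "MyScaledKernel(ℓ = ", first(k.ℓ), ")")

k = MyScaledKernel([2.0])
k([1.0, 2.0], [2.0, 4.0])  # exp(-5 / 4)
```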
