Note that `BaseKernel` does not use `Distances.jl` and can therefore be a bit slower.
### Additional Options
Finally, there are additional functions you can define to bring in more features:
- `KernelFunctions.trainable(k::MyKernel)`: defines the trainable parameters of your kernel; it should return a `Tuple` of your parameters. These parameters will then be passed to the `Flux.params` function. For some examples, see the `trainable.jl` file in `src/`.
- `KernelFunctions.iskroncompatible(k::MyKernel)`: if your kernel factorizes over dimensions, you can declare `iskroncompatible(k) = true` to use Kronecker methods.
- `KernelFunctions.dim(x::MyDataType)`: by default, the dimension of the inputs is only checked for vectors of type `AbstractVector{<:Real}`. If you want to check the dimensionality of your inputs, dispatch the `dim` function on your datatype. Note that `0` is the default.
- You can also directly overload `KernelFunctions.validate_inputs(x::MyDataType, y::MyDataType)` if you want to run special checks for your input types.
- `kernelmatrix(k::MyKernel, ...)`: you can redefine the various `kernelmatrix` functions to optimize the computations for your kernel.
- `Base.print(io::IO, k::MyKernel)`: if you want to specialize the printing of your kernel.
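The `trainable` convention above can be sketched in plain Julia. This is a minimal, self-contained example: `MyScaleKernel` is a hypothetical type, and the `trainable` function is defined locally here; in practice you would extend `KernelFunctions.trainable` for your own subtype of the kernel abstract type.

```julia
# Hypothetical kernel with one trainable lengthscale parameter.
struct MyScaleKernel
    ℓ::Vector{Float64}  # stored as a vector so it can be updated in place
end

# Return the trainable parameters as a Tuple, which is what `Flux.params` expects:
trainable(k::MyScaleKernel) = (k.ℓ,)

k = MyScaleKernel([2.0])
trainable(k)  # → ([2.0],)
```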
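The `iskroncompatible` declaration is a standard Julia trait pattern, sketched below with a hypothetical kernel type. The fallback method is written out here for the example to be self-contained; KernelFunctions.jl defines its own fallback.

```julia
# Trait pattern: a generic fallback plus a specialized method per kernel type.
iskroncompatible(k) = false                    # default: no factorization assumed
struct MySeparableKernel end                   # hypothetical kernel factorizing over dimensions
iskroncompatible(k::MySeparableKernel) = true  # opts in to Kronecker methods

iskroncompatible(MySeparableKernel())  # → true
```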
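A `kernelmatrix` overload can be sketched as follows, with a hypothetical `MyExpKernel` type and a local `kappa` helper standing in for the kernel function. A specialized method like this is where you would plug in an optimized computation; here it is just a comprehension over all input pairs.

```julia
# Hypothetical exponential kernel and its base kernel function.
struct MyExpKernel end
kappa(::MyExpKernel, d::Real) = exp(-d)

# Specialized kernelmatrix: evaluate the kernel on every pair (xi, yi).
function kernelmatrix(k::MyExpKernel, x::AbstractVector{<:Real}, y::AbstractVector{<:Real})
    return [kappa(k, abs(xi - yi)) for xi in x, yi in y]
end

kernelmatrix(MyExpKernel(), [0.0, 1.0], [0.0, 1.0])  # 2×2 matrix with ones on the diagonal
```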