diff --git a/README.md b/README.md
index 05a15e0..3f34c34 100644
--- a/README.md
+++ b/README.md
@@ -18,7 +18,7 @@
 NNTree(data, metric; leafsize, reorder)
 ```
 * `data`: This parameter represents the points to build up the tree from. It can either be a matrix of size `nd × np` with the points to insert in the tree where `nd` is the dimensionality of the points, `np` is the number of points or it can be a `Vector{V}` where `V` is itself a subtype of an `AbstractVector` and such that `eltype(V)` and `length(V)` is defined.
-* `metric`: The metric to use, defaults to `Euclidean`. This is one of the `Metric` types defined in the `Distances.jl` packages. It is possible to define your own metrics by simply creating new types that are subtypes of `Metric`.
+* `metric`: The metric to use, defaults to `Euclidean`. This is one of the `Metric` types defined in the `Distances.jl` package. It is possible to define your own metrics by creating new types that are subtypes of `Metric` (see below).
 * `leafsize` (keyword argument): Determines at what number of points to stop splitting the tree further. There is a trade-off between traversing the tree and having to evaluate the metric function for increasing number of points.
 * `reorder` (keyword argument): While building the tree this will put points close in distance close in memory since this helps with cache locality. In this case, a copy of the original data will be made so that the original data is left unmodified. This can have a significant impact on performance and is by default set to `true`.
 
@@ -42,6 +42,20 @@
 balltree = BallTree(data, Minkowski(3.5); reorder = false)
 brutetree = BruteTree(data)
 ```
+How to write your own metric:
+```jl
+using Distances, NearestNeighbors
+
+# Declare the metric
+struct MyMetric <: Metric end
+
+# and define the logic
+(::MyMetric)(x, y) = Euclidean()(x, y)
+
+# Now it's usable!
+BruteTree(randn(2, 100), MyMetric())
+```
+
 ## k Nearest Neighbor (kNN) searches
 
 A kNN search finds the `k` nearest neighbors to given point(s).
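
For reviewers trying out the patched README, a query against a tree built with the constructors above looks roughly like this. It is a sketch using the package's documented `knn` function; the sample data, tree type, and `k` here are illustrative choices, not part of the patch:

```jl
using NearestNeighbors

data = rand(3, 100)    # 100 random points in 3 dimensions (nd × np matrix)
kdtree = KDTree(data)  # any of the tree types works here
point = rand(3)

# Find the 5 nearest neighbors to `point`; `true` sorts the results by distance
idxs, dists = knn(kdtree, point, 5, true)
```

`idxs` holds the column indices of the neighbors in `data` and `dists` their distances to `point`, sorted in ascending order because the sort flag is `true`.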