Commit d6c320f: whitespace fixes

1 parent 3b289f5 commit d6c320f

2 files changed: +93 −89 lines

docs/src/anatomy_of_an_implementation.md (86 additions & 86 deletions)
````diff
@@ -10,19 +10,19 @@ For a transformer, implementations ordinarily implement `transform` instead of
 
 !!! important
 
-    The core implementations of `fit`, `predict`, etc.,
-    always have a *single* `data` argument, as in `fit(algorithm, data; verbosity=1)`.
-    Calls like `fit(algorithm, X, y)` are provided as additional convenience methods.
+    The core implementations of `fit`, `predict`, etc.,
+    always have a *single* `data` argument, as in `fit(algorithm, data; verbosity=1)`.
+    Calls like `fit(algorithm, X, y)` are provided as additional convenience methods.
 
 !!! note
 
-    If the `data` object consumed by `fit`, `predict`, or `transform` is not
-    a suitable table¹, array³, tuple of tables and arrays, or some
-    other object implementing
-    the MLUtils.jl `getobs`/`numobs` interface,
-    then an implementation must: (i) suitably overload the trait
-    [`LearnAPI.data_interface`](@ref); and/or (ii) overload [`obs`](@ref), as
-    illustrated below under [Providing an advanced data interface](@ref).
+    If the `data` object consumed by `fit`, `predict`, or `transform` is not
+    a suitable table¹, array³, tuple of tables and arrays, or some
+    other object implementing
+    the MLUtils.jl `getobs`/`numobs` interface,
+    then an implementation must: (i) suitably overload the trait
+    [`LearnAPI.data_interface`](@ref); and/or (ii) overload [`obs`](@ref), as
+    illustrated below under [Providing an advanced data interface](@ref).
 
 The first line below imports the lightweight package LearnAPI.jl whose methods we will be
 extending. The second imports libraries needed for the core algorithm.
````
````diff
@@ -39,7 +39,7 @@ Here's a new type whose instances specify ridge regression parameters:
 
 ```@example anatomy
 struct Ridge{T<:Real}
-    lambda::T
+    lambda::T
 end
 nothing # hide
 ```
````
````diff
@@ -63,7 +63,7 @@ changed to `0.05`.
 
 ## Implementing `fit`
 
-A ridge regressor requires two types of data for training: *input features* `X`, which
+A ridge regressor requires two types of data for training: input features `X`, which
 here we suppose are tabular¹, and a [target](@ref proxy) `y`, which we suppose is a
 vector.
 
````
````diff
@@ -72,9 +72,9 @@ coefficients labelled by feature name for inspection after training:
 
 ```@example anatomy
 struct RidgeFitted{T,F}
-    algorithm::Ridge
-    coefficients::Vector{T}
-    named_coefficients::F
+    algorithm::Ridge
+    coefficients::Vector{T}
+    named_coefficients::F
 end
 nothing # hide
 ```
````
````diff
@@ -87,25 +87,25 @@ The core implementation of `fit` looks like this:
 ```@example anatomy
 function LearnAPI.fit(algorithm::Ridge, data; verbosity=1)
 
-    X, y = data
+    X, y = data
 
-    # data preprocessing:
-    table = Tables.columntable(X)
-    names = Tables.columnnames(table) |> collect
-    A = Tables.matrix(table, transpose=true)
+    # data preprocessing:
+    table = Tables.columntable(X)
+    names = Tables.columnnames(table) |> collect
+    A = Tables.matrix(table, transpose=true)
 
-    lambda = algorithm.lambda
+    lambda = algorithm.lambda
 
-    # apply core algorithm:
-    coefficients = (A*A' + algorithm.lambda*I)\(A*y) # vector
+    # apply core algorithm:
+    coefficients = (A*A' + algorithm.lambda*I)\(A*y) # vector
 
-    # determine named coefficients:
-    named_coefficients = [names[j] => coefficients[j] for j in eachindex(names)]
+    # determine named coefficients:
+    named_coefficients = [names[j] => coefficients[j] for j in eachindex(names)]
 
-    # make some noise, if allowed:
-    verbosity > 0 && @info "Coefficients: $named_coefficients"
+    # make some noise, if allowed:
+    verbosity > 0 && @info "Coefficients: $named_coefficients"
 
-    return RidgeFitted(algorithm, coefficients, named_coefficients)
+    return RidgeFitted(algorithm, coefficients, named_coefficients)
 end
 ```
````

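The core computation in the hunk above is the ridge normal equation, `coefficients = (A*A' + lambda*I)\(A*y)`, where `A` is a `p x n` matrix whose columns are observations. As a sanity check, here is a NumPy sketch (synthetic data, not part of the commit) showing that this transposed form agrees with a solution obtained by a different route, an augmented least-squares problem in the usual rows-are-observations layout:

```python
import numpy as np

# Ridge normal equation as used in the doc's `fit`:
# coefficients = (A*A' + lambda*I) \ (A*y), with A of shape p x n
# (features x observations, the doc's transposed convention).
rng = np.random.default_rng(0)
p, n = 3, 10
A = rng.normal(size=(p, n))   # observations are columns
y = rng.normal(size=n)
lam = 0.1

coefficients = np.linalg.solve(A @ A.T + lam * np.eye(p), A @ y)

# Independent check: minimize ||X w - y||^2 + lam ||w||^2 with X = A'
# (rows are observations) via an augmented least-squares system.
X = A.T
X_aug = np.vstack([X, np.sqrt(lam) * np.eye(p)])
y_aug = np.concatenate([y, np.zeros(p)])
w, *_ = np.linalg.lstsq(X_aug, y_aug, rcond=None)

print(np.allclose(coefficients, w))  # True
```

Both routes solve the same regularized problem, so the two coefficient vectors coincide up to floating-point tolerance.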
````diff
@@ -127,7 +127,7 @@ Here's the implementation for our ridge regressor:
 
 ```@example anatomy
 LearnAPI.predict(model::RidgeFitted, ::LiteralTarget, Xnew) =
-    Tables.matrix(Xnew)*model.coefficients
+    Tables.matrix(Xnew)*model.coefficients
 ```
 
 ## Accessor functions
````
````diff
@@ -156,7 +156,7 @@ overload it to dump the named version of the coefficients:
 
 ```@example anatomy
 LearnAPI.minimize(model::RidgeFitted) =
-    RidgeFitted(model.algorithm, model.coefficients, nothing)
+    RidgeFitted(model.algorithm, model.coefficients, nothing)
 ```
 
 Crucially, we can still use `LearnAPI.minimize(model)` in place of `model` to make new
````
````diff
@@ -187,19 +187,19 @@ The macro can be used to specify multiple traits simultaneously:
 
 ```@example anatomy
 @trait(
-    Ridge,
-    constructor = Ridge,
-    target = true,
-    kinds_of_proxy=(LiteralTarget(),),
-    descriptors = (:regression,),
-    functions = (
-        fit,
-        minimize,
-        predict,
-        obs,
-        LearnAPI.algorithm,
-        LearnAPI.coefficients,
-    )
+    Ridge,
+    constructor = Ridge,
+    target = true,
+    kinds_of_proxy=(LiteralTarget(),),
+    descriptors = (:regression,),
+    functions = (
+        fit,
+        minimize,
+        predict,
+        obs,
+        LearnAPI.algorithm,
+        LearnAPI.coefficients,
+    )
 )
 nothing # hide
 ```
````
````diff
@@ -230,10 +230,10 @@ enabling the kind of workflow previewed in [Sample workflow](@ref):
 
 ```@example anatomy
 LearnAPI.fit(algorithm::Ridge, X, y; kwargs...) =
-    fit(algorithm, (X, y); kwargs...)
+    fit(algorithm, (X, y); kwargs...)
 
 LearnAPI.predict(model::RidgeFitted, Xnew) =
-    predict(model, LiteralTarget(), Xnew)
+    predict(model, LiteralTarget(), Xnew)
 ```
 
 ## [Demonstration](@id workflow)
````
````diff
@@ -292,40 +292,40 @@ using LearnAPI
 using LinearAlgebra, Tables
 
 struct Ridge{T<:Real}
-    lambda::T
+    lambda::T
 end
 
 Ridge(; lambda=0.1) = Ridge(lambda)
 
 struct RidgeFitted{T,F}
-    algorithm::Ridge
-    coefficients::Vector{T}
-    named_coefficients::F
+    algorithm::Ridge
+    coefficients::Vector{T}
+    named_coefficients::F
 end
 
 LearnAPI.algorithm(model::RidgeFitted) = model.algorithm
 LearnAPI.coefficients(model::RidgeFitted) = model.named_coefficients
 LearnAPI.minimize(model::RidgeFitted) =
-    RidgeFitted(model.algorithm, model.coefficients, nothing)
+    RidgeFitted(model.algorithm, model.coefficients, nothing)
 
 LearnAPI.fit(algorithm::Ridge, X, y; kwargs...) =
-    fit(algorithm, (X, y); kwargs...)
+    fit(algorithm, (X, y); kwargs...)
 LearnAPI.predict(model::RidgeFitted, Xnew) = predict(model, LiteralTarget(), Xnew)
 
 @trait(
-    Ridge,
-    constructor = Ridge,
-    target = true,
-    kinds_of_proxy=(LiteralTarget(),),
-    descriptors = (:regression,),
-    functions = (
-        fit,
-        minimize,
-        predict,
-        obs,
-        LearnAPI.algorithm,
-        LearnAPI.coefficients,
-    )
+    Ridge,
+    constructor = Ridge,
+    target = true,
+    kinds_of_proxy=(LiteralTarget(),),
+    descriptors = (:regression,),
+    functions = (
+        fit,
+        minimize,
+        predict,
+        obs,
+        LearnAPI.algorithm,
+        LearnAPI.coefficients,
+    )
 )
 
 n = 10 # number of observations
````
````diff
@@ -344,20 +344,20 @@ new type:
 
 ```@example anatomy2
 struct RidgeFitObs{T,M<:AbstractMatrix{T}}
-    A::M                  # p x n
-    names::Vector{Symbol} # features
-    y::Vector{T}          # target
+    A::M                  # p x n
+    names::Vector{Symbol} # features
+    y::Vector{T}          # target
 end
 ```
 
 Now we overload `obs` to carry out the data pre-processing previously in `fit`, like this:
 
 ```@example anatomy2
 function LearnAPI.obs(::Ridge, data)
-    X, y = data
-    table = Tables.columntable(X)
-    names = Tables.columnnames(table) |> collect
-    return RidgeFitObs(Tables.matrix(table)', names, y)
+    X, y = data
+    table = Tables.columntable(X)
+    names = Tables.columnnames(table) |> collect
+    return RidgeFitObs(Tables.matrix(table)', names, y)
 end
 ```
 
````
````diff
@@ -369,27 +369,27 @@ methods - one to handle "regular" input, and one to handle the pre-processed data:
 ```@example anatomy2
 function LearnAPI.fit(algorithm::Ridge, observations::RidgeFitObs; verbosity=1)
 
-    lambda = algorithm.lambda
+    lambda = algorithm.lambda
 
-    A = observations.A
-    names = observations.names
-    y = observations.y
+    A = observations.A
+    names = observations.names
+    y = observations.y
 
-    # apply core algorithm:
-    coefficients = (A*A' + algorithm.lambda*I)\(A*y) # vector
+    # apply core algorithm:
+    coefficients = (A*A' + algorithm.lambda*I)\(A*y) # vector
 
-    # determine named coefficients:
-    named_coefficients = [names[j] => coefficients[j] for j in eachindex(names)]
+    # determine named coefficients:
+    named_coefficients = [names[j] => coefficients[j] for j in eachindex(names)]
 
-    # make some noise, if allowed:
-    verbosity > 0 && @info "Coefficients: $named_coefficients"
+    # make some noise, if allowed:
+    verbosity > 0 && @info "Coefficients: $named_coefficients"
 
-    return RidgeFitted(algorithm, coefficients, named_coefficients)
+    return RidgeFitted(algorithm, coefficients, named_coefficients)
 
 end
 
 LearnAPI.fit(algorithm::Ridge, data; kwargs...) =
-    fit(algorithm, obs(algorithm, data); kwargs...)
+    fit(algorithm, obs(algorithm, data); kwargs...)
 ```
 
 We provide an overloading of `LearnAPI.target` to handle the additional supported data
````
````diff
@@ -409,7 +409,7 @@ accessing individual observations*. It usually suffices to overload `Base.getindex`
 
 ```@example anatomy2
 Base.getindex(data::RidgeFitObs, I) =
-    RidgeFitObs(data.A[:,I], data.names, data.y[I])
+    RidgeFitObs(data.A[:,I], data.names, data.y[I])
 Base.length(data::RidgeFitObs) = length(data.y)
 ```
````

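The `Base.getindex`/`Base.length` overloads in the hunk above give the pre-processed observations MLUtils-style subsampling: indexing selects columns of `A` and matching entries of `y`, leaving `names` untouched. The same idea, sketched in Python with a hypothetical `RidgeFitObs` analogue (illustrative only, not LearnAPI code):

```python
import numpy as np

class RidgeFitObs:
    """Pre-processed training data: A is p x n (observations are columns)."""
    def __init__(self, A, names, y):
        self.A, self.names, self.y = A, names, y

    def __getitem__(self, I):
        # Subsample observations: columns of A and entries of y, same names.
        return RidgeFitObs(self.A[:, I], self.names, self.y[I])

    def __len__(self):
        return len(self.y)

obs_data = RidgeFitObs(np.arange(12.0).reshape(3, 4), ["a", "b", "c"], np.arange(4.0))
subset = obs_data[np.array([0, 2])]
print(len(subset), subset.A.shape)  # 2 (3, 2)
```

Subsampling returns a value of the same type, so `fit` can consume a subset exactly as it consumes the full pre-processed data.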
````diff
@@ -420,10 +420,10 @@ case:
 LearnAPI.obs(::RidgeFitted, Xnew) = Tables.matrix(Xnew)'
 
 LearnAPI.predict(model::RidgeFitted, ::LiteralTarget, observations::AbstractMatrix) =
-    observations'*model.coefficients
+    observations'*model.coefficients
 
 LearnAPI.predict(model::RidgeFitted, ::LiteralTarget, Xnew) =
-    predict(model, LiteralTarget(), obs(model, Xnew))
+    predict(model, LiteralTarget(), obs(model, Xnew))
 ```
 
 ### Important notes:
````
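In the hunk above, `predict` consumes a matrix whose columns are observations, so predictions are computed as `observations' * model.coefficients`. A NumPy sketch of the same arithmetic (invented numbers, for illustration only):

```python
import numpy as np

coefficients = np.array([2.0, -1.0, 0.5])   # one weight per feature (p = 3)
observations = np.array([[1.0, 0.0],        # p x n: each column is one observation
                         [2.0, 1.0],
                         [0.0, 4.0]])

# Transposing recovers the rows-are-observations layout, giving
# one dot product per observation:
predictions = observations.T @ coefficients
print(predictions)  # [0. 1.]
```

The first column contributes `1*2 + 2*(-1) + 0*0.5 = 0` and the second `0*2 + 1*(-1) + 4*0.5 = 1`, one prediction per observation.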

docs/src/reference.md (7 additions & 3 deletions)
````diff
@@ -25,9 +25,13 @@ an example of data, the observations being the rows. Typically, data provided to
 LearnAPI.jl algorithms, will implement the
 [MLUtils.jl](https://juliaml.github.io/MLUtils.jl/stable) `getobs/numobs` interface for
 accessing individual observations, but implementations can opt out of this requirement;
-see [`obs`](@ref) and [`LearnAPI.data_interface`](@ref) for details. In the MLUtils.jl
-convention, observations in tables are the rows but observations in a matrix are the
-columns.
+see [`obs`](@ref) and [`LearnAPI.data_interface`](@ref) for details.
+
+!!! note
+
+    In the MLUtils.jl
+    convention, observations in tables are the rows but observations in a matrix are the
+    columns.
 
 ### [Hyperparameters](@id hyperparameters)
````

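The note added in this hunk is easy to check concretely: under the MLUtils.jl convention a table yields its rows as observations, while a plain matrix yields its columns. A Python sketch of the two access patterns (illustrative only; MLUtils itself is a Julia package, and these `getobs_*` helpers are invented for this example):

```python
import numpy as np

# "Table": a list of named rows — observation i is row i.
table = [{"x1": 1.0, "x2": 10.0}, {"x1": 2.0, "x2": 20.0}, {"x1": 3.0, "x2": 30.0}]
def getobs_table(t, i):
    return t[i]

# Matrix: a p x n array — observation i is column i.
X = np.array([[1.0, 2.0, 3.0],
              [10.0, 20.0, 30.0]])
def getobs_matrix(X, i):
    return X[:, i]

# The same observation, reached under each convention:
print(getobs_table(table, 1))   # {'x1': 2.0, 'x2': 20.0}
print(getobs_matrix(X, 1))      # [ 2. 20.]
```

Here the matrix is the transpose of the table's row layout, which is why both calls return the same observation.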
0 commit comments
