
Commit db0702b

haampie authored and andreasnoack committed
Update documentation (#157)
* Remove section about benchmarking
* WIP on documentation
* Fix typo
* Remove remainder of plot keyword
* Chebyshev & power method
* Uncomment deploy command
* A -> B
* Move things to separate folders and added the remaining methods in separate files
* More docs
* Add the iterator page
* Update docs for stationary methods
* ::Int consistency
* Typo in IDR(s) docs
* Add notes on underlying iterators
* for _ = _ to for _ in _
1 parent 29ba93b commit db0702b

38 files changed, +1309 −1927 lines

docs/make.jl

Lines changed: 25 additions & 7 deletions
@@ -6,14 +6,32 @@ makedocs(
     sitename = "IterativeSolvers.jl",
     pages = [
         "Home" => "index.md",
-        "Manual" => "user_manual.md",
-        "Library" => [
-            "Public" => "library/public.md",
-            "Internal" => "library/internal.md",
-        ],
+        "Getting started" => "getting_started.md",
+        "Preconditioning" => "preconditioning.md",
+        "Linear systems" => [
+            "Conjugate Gradients" => "linear_systems/cg.md",
+            "Chebyshev iteration" => "linear_systems/chebyshev.md",
+            "MINRES" => "linear_systems/minres.md",
+            "BiCGStab(l)" => "linear_systems/bicgstabl.md",
+            "IDR(s)" => "linear_systems/idrs.md",
+            "Restarted GMRES" => "linear_systems/gmres.md",
+            "LSMR" => "linear_systems/lsmr.md",
+            "LSQR" => "linear_systems/lsqr.md",
+            "Stationary methods" => "linear_systems/stationary.md"
+        ],
+        "Eigenproblems" => [
+            "Power method" => "eigenproblems/power_method.md",
+        ],
+        "SVDL" => "svd/svdl.md",
+        "Randomized algorithms" => "randomized.md",
+        "The iterator approach" => "iterators.md",
+        # "Additional resources" => [
+        # # "Public" => "library/public.md",
+        # # "Internal" => "library/internal.md",
+        # ],
         "About" => [
-            "Contributing" => "about/CONTRIBUTING.md",
-            "License" => "about/license.md",
+            "Contributing" => "about/CONTRIBUTING.md",
+            "License" => "about/license.md",
         ]
     ]
 )

docs/src/about/CONTRIBUTING.md

Lines changed: 1 addition & 83 deletions
@@ -1,7 +1,7 @@
 # Contributing
 
 Contributions are always welcome, as are feature requests and suggestions. Feel
-free to open [issues](https://help.github.com/articles/creating-an-issue/) and [pull requests](https://help.github.com/articles/creating-a-pull-request/) at any time. If you aren't familiar with git or Github please start [now](https://help.github.com/articles/good-resources-for-learning-git-and-github/).
+free to open an [issue](https://github.com/JuliaMath/IterativeSolvers.jl/issues) or [pull request](https://github.com/JuliaMath/IterativeSolvers.jl/pulls) at any time.
 
 It is important to note that almost every method in the package has documentation;
 to find out what a method does, simply use `?<method>` in the terminal.
@@ -79,85 +79,3 @@ A more detailed explanation of all the functions is in both the public and internal
 documentation of `ConvergenceHistory`.
 
 The richest example of the usage of `ConvergenceHistory` is in `svdl`.
-
-
-## Adding benchmarks
-
-The [Benchmarks](@ref) tab of the documentation is built automatically with Travis.
-Any benchmark added will be displayed automatically after a successful pull request.
-
-The benchmark suite is built by taking the cross product of the available matrices
-and the available methods: if there are `n` methods and `m` linear operators, then `n*m`
-is the upper limit on the number of benchmarks generated. Some methods are not compatible with certain
-matrices, so to avoid generating unnecessary benchmarks each method and matrix has
-traits; the linear operator traits are inspired by [MatrixDepot.jl](http://matrixdepotjl.readthedocs.io/en/latest/properties.html).
-
-**Method traits**
-
-* accessible : the method accesses the linear operator's fields.
-* inverse : `A`'s inverse must exist.
-* symmetric : `A` must be symmetric.
-* pos-def : `A` must be positive definite.
-
-**Linear Operator traits**
-
-* accessible : is accessible.
-* inverse : `A`'s inverse exists.
-* symmetric : `A` is symmetric.
-* pos-def : `A` is positive definite.
-* eigen : part of the eigensystem of the matrix is explicitly known.
-* graph : an adjacency matrix of a graph.
-* ill-cond : the matrix is ill-conditioned for some parameter values.
-* random : the matrix has random entries.
-* regprob : the output is a test problem for regularization methods.
-* sparse : the matrix is sparse.
-
-A benchmark between a method and a linear operator is generated if and only if
-the traits of the method are a subset of the traits of the linear operator.
-
-Benchmarks are stored in [Benchmarks.jl](https://github.com/JuliaLang/IterativeSolvers.jl/tree/master/benchmark/Benchmarks.jl).
-To add a method use `addEqMethod`.
-
-```julia
-addEqMethod(methods, "jacobi", jacobi, ["inverse", "accessible"])
-addEqMethod(methods, "gauss_seidel", gauss_seidel, ["inverse", "accessible"])
-addEqMethod(methods, "sor", sor, ["inverse", "accessible"])
-addEqMethod(methods, "ssor", ssor, ["inverse", "accessible", "symmetric"])
-addEqMethod(methods, "cg", cg, ["inverse", "symmetric", "pos-def"])
-addEqMethod(methods, "gmres", gmres, ["inverse"])
-addEqMethod(methods, "lsqr", lsqr, ["inverse"])
-addEqMethod(methods, "chebyshev", chebyshev, ["inverse", "accessible"])
-```
-
-Here `methods` is a dictionary, the second argument is the name to be displayed in
-the benchmarks, the third argument is the function and the fourth is the traits.
-Every function has a predetermined call in the `buildCall` function.
-
-To add an equation use `addEquation`.
-
-```julia
-# Sparse matrix equations
-addEquation(
-    equations, "Poisson", ["Sparse", "Poisson"],
-    ["sparse", "inverse", "symmetric", "pos-def", "eigen", "accessible"],
-    :(matrixdepot("poisson", 4))
-)
-
-# Function matrix equations
-addEquation(
-    equations, "SOLtest", ["Function", "SOLtest"],
-    ["function", "inverse"],
-    :(buildSol(10)),
-    10
-)
-```
-
-Here `equations` is a dictionary, the second argument is the name to be displayed in
-the benchmarks, the third argument is the path inside the `BenchmarkGroup` type,
-the fourth argument is the traits, the fifth is the matrix generator and
-the sixth is the size of the matrix. The size has to be passed explicitly when
-it is impossible to deduce the dimension from the generator; in this case `buildSol`
-generates a function and not a matrix.
-
-To add a custom benchmark, use the `suite` variable directly, which is the `BenchmarkGroup`
-of the package; to learn more about this type see [BenchmarkTools.jl](https://github.com/JuliaCI/BenchmarkTools.jl).
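For reference, the trait-matching rule described in the removed section above amounts to a simple subset test. A minimal sketch (the helper name `should_benchmark` is hypothetical and not part of the benchmark suite):

```julia
# Hypothetical illustration of the trait-matching rule: a (method, operator) pair
# is benchmarked iff the method's traits are a subset of the operator's traits.
should_benchmark(method_traits, operator_traits) =
    issubset(Set(method_traits), Set(operator_traits))

should_benchmark(["inverse", "symmetric", "pos-def"],                    # cg
                 ["sparse", "inverse", "symmetric", "pos-def", "eigen"]) # Poisson matrix
# -> true
```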
docs/src/eigenproblems/power_method.md

Lines changed: 15 additions & 0 deletions
@@ -0,0 +1,15 @@
+# [(Inverse) power method](@id PowerMethod)
+
+Solves the eigenproblem $Ax = λx$ approximately, where $A$ is a general linear map. By default it converges towards the dominant eigenpair $(λ, x)$, i.e. the one for which $|λ|$ is largest. Shift-and-invert can be applied to target a specific eigenvalue near `shift` in the complex plane.
+
+## Usage
+
+```@docs
+powm
+powm!
+invpowm
+invpowm!
+```
+
+## Implementation details
+Storage requirements are 3 vectors: the approximate eigenvector `x`, the residual vector `r` and a temporary. The residual norm lags one iteration behind, since it is computed when $Ax$ is performed. The final residual norm is therefore even smaller.
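A usage sketch of the above (keyword names and return values are assumptions, not taken from the docstrings; the shift-and-invert call expects a map that applies $(A - σI)^{-1}$):

```julia
# Illustrative sketch only; signatures are assumed, see the docstrings above.
using IterativeSolvers

A = [4.0 1.0; 1.0 3.0]

# Dominant eigenpair via the power method.
λ, x = powm(A, tol = 1e-8, maxiter = 200)

# Shift-and-invert: pass a map that applies inv(A - σI) together with the shift σ
# to target the eigenvalue of A closest to σ.
σ = 1.0
Id = [1.0 0.0; 0.0 1.0]
B = inv(A - σ * Id)
λ_near, x_near = invpowm(B, shift = σ, tol = 1e-8)
```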

docs/src/getting_started.md

Lines changed: 126 additions & 0 deletions
@@ -0,0 +1,126 @@
+# Getting started
+
+## Installation
+
+The package can be installed via Julia's package manager.
+
+```julia
+julia> Pkg.add("IterativeSolvers")
+```
+
+## Interface
+
+Virtually all solvers share the same pair of function declarations:
+
+```julia
+solver(A, args...; kwargs...)
+solver!(x, A, args...; kwargs...)
+```
+
+where `A` is a [linear operator](@ref matrixfree) and `x` an initial guess. The second declaration also updates `x` in-place.
+
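For example (an illustrative sketch; `cg` is used here, assuming a symmetric positive-definite system so that it applies):

```julia
using IterativeSolvers

A = rand(10, 10)
A = A' * A + 10 * one(A)   # a symmetric positive-definite test matrix
b = rand(10)

x = cg(A, b)               # returns a freshly allocated solution vector

x0 = zeros(10)
cg!(x0, A, b)              # starts from the initial guess x0 and overwrites it
```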
+### [Explicit matrices and the matrix-free approach](@id matrixfree)
+Rather than constructing an explicit matrix `A` of the type `Matrix` or `SparseMatrixCSC`, it is also possible to pass a general linear operator that performs matrix operations implicitly. This is called the **matrix-free** approach.
+
+For matrix-free types of `A` the following interface is expected to be defined (a minimal sketch follows the tip below):
+
+- `A*v` computes the matrix-vector product for a `v::AbstractVector`;
+- `A_mul_B!(y, A, v)` computes the matrix-vector product for a `v::AbstractVector` in-place;
+- `eltype(A)` returns the element type implicit in the equivalent matrix representation of `A`;
+- `size(A, d)` returns the nominal dimension along the `d`th axis in the equivalent matrix representation of `A`.
+
+!!! tip "Matrix-free with LinearMaps.jl"
+    We strongly recommend [LinearMaps.jl](https://github.com/Jutho/LinearMaps.jl) for matrix-free linear operators, as it implements the above methods already for you; you just have to write the action of the linear map.
+
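Here is a minimal matrix-free operator sketch implementing that interface (the type name is made up, and the `A_mul_B!` method reflects the Julia 0.6-era API assumed by these docs):

```julia
import Base: *, eltype, size, A_mul_B!

# A made-up diagonal scaling operator that never stores an explicit matrix.
struct DiagonalOperator
    d::Vector{Float64}
end

eltype(::DiagonalOperator) = Float64
size(D::DiagonalOperator, dim::Int) = length(D.d)                # square operator
*(D::DiagonalOperator, v::AbstractVector) = D.d .* v
A_mul_B!(y, D::DiagonalOperator, v::AbstractVector) = copy!(y, D.d .* v)

# D = DiagonalOperator(collect(1.0:100.0))
# cg(D, ones(100))   # the solver only ever applies D to vectors
```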
+### Additional arguments
+
+Keyword names vary depending on the method, but some of them always have the same meaning (a short usage sketch follows this list):
+
+- `tol`: (relative) stopping tolerance of the method;
+- `verbose`: print information during the iterations;
+- `maxiter`: maximum number of allowed iterations;
+- `Pl` and `Pr`: left and right preconditioner. See [Preconditioning](@ref Preconditioning);
+- `log::Bool = false`: output an extra element of type `ConvergenceHistory` containing the convergence history.
+
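For instance (the values are arbitrary; only the keyword names come from the list above):

```julia
using IterativeSolvers

A = rand(10, 10)
b = rand(10)

# Arbitrary example values for the common keywords.
x = gmres(A, b, tol = 1e-8, maxiter = 300, verbose = false)
```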
+### `log` keyword
+
+Most solvers accept the `log` keyword. Set `log` to `true` when more information about the iteration is required.
+
+```julia
+x, ch = cg(rand(10, 10), rand(10), log = true)
+svd, L, ch = svdl(rand(100, 100), log = true)
+```
+
+The function will then return one extra output of type `ConvergenceHistory`.
+
+## ConvergenceHistory
+
+A [`ConvergenceHistory`](@ref) instance stores information about a solver run.
+
+Number of iterations:
+
+```julia
+ch.iters
+```
+
+Convergence status:
+
+```julia
+ch.isconverged
+```
+
+Stopping tolerances (accessed with a `Symbol` key):
+
+```julia
+ch[:tol]
+```
+
+Maximum number of iterations per restart (restarted methods only):
+
+```julia
+nrests(ch)
+```
+
+Number of matrix-vector and transposed matrix-vector products:
+
+```julia
+nprods(ch)
+```
+
+Data stored at each iteration, accessed with a `Symbol` key; the result is either a vector
+or a matrix. This data can be many things, most commonly the residual norm.
+
+```julia
+ch[:resnorm]       # Vector or Matrix
+ch[:resnorm, x]    # Vector or Matrix element
+ch[:resnorm, x, y] # Matrix element
+```
+
+```@docs
+ConvergenceHistory
+```
+
+### Plotting
+
+`ConvergenceHistory` provides a recipe for the package [Plots.jl](https://github.com/tbreloff/Plots.jl), which makes it easy to
+plot the history on any of the plotting backends. Two recipes are provided:
+
+One for the whole `ConvergenceHistory`:
+
+```julia
+plot(ch)
+```
+
+The other plots the data bound to a key:
+
+```julia
+_, ch = gmres(rand(10, 10), rand(10), maxiter = 100, log = true)
+plot(ch, :resnorm, sep = :blue)
+```
+
+*Additional plot keywords*
+
+`sep::Symbol = :white`: color of the line separator in restarted methods.

docs/src/index.md

Lines changed: 20 additions & 74 deletions
@@ -1,91 +1,37 @@
 # IterativeSolvers.jl
 
-IterativeSolvers.jl is a Julia package that provides iterative algorithms for
-solving linear systems, eigensystems, and singular value problems. The purpose
-of this package is to provide efficient Julia implementations for iterative
-methods. The package aims to accept a wide variety of input types and that's
-why most arguments don't specify a specific type, however this is still in
-progress.
+IterativeSolvers.jl is a Julia package that provides efficient iterative algorithms for solving large linear systems, eigenproblems, and singular value problems. Most of the methods can be used *matrix-free*.
 
-For bug reports, feature requests and questions please submit an issue.
-If you're interested in contributing, please see the [Contributing](@ref) guide.
+For bug reports, feature requests and questions please submit an issue. If you're interested in contributing, please see the [Contributing](@ref) guide.
 
 For more information on future methods have a look at the package [roadmap](https://github.com/JuliaLang/IterativeSolvers.jl/issues/1) for deterministic methods; for randomized algorithms check [here](https://github.com/JuliaLang/IterativeSolvers.jl/issues/33).
 
-## Linear Solvers
+## What method should I use for linear systems?
 
-**Stationary methods**
+When solving a linear system $Ax = b$ with a square matrix $A$ there are quite a few options. The typical choices are listed below (a short usage sketch follows this overview):
 
-* Jacobi
-* Gauss-Seidel
-* Successive over relaxation
-* Symmetric successive over relaxation
+| Method | When to use it |
+|---------------------|--------------------------------------------------------------------------|
+| [Conjugate Gradients](@ref CG) | Best choice for **symmetric**, **positive-definite** matrices |
+| [MINRES](@ref MINRES) | For **symmetric**, **indefinite** matrices |
+| [GMRES](@ref GMRES) | For **nonsymmetric** matrices when a good [preconditioner](@ref Preconditioning) is available |
+| [IDR(s)](@ref IDRs) | For **nonsymmetric**, **strongly indefinite** problems without a good preconditioner |
+| [BiCGStab(l)](@ref BiCGStabl) | Otherwise, for **nonsymmetric** problems |
 
-**Non stationary methods**
+We also offer [Chebyshev iteration](@ref Chebyshev) as an alternative to Conjugate Gradients when bounds on the spectrum are known.
 
-* IDRS
-* LSMR
-* LSQR
-* Conjugate gradients (CG)
-* Chebyshev iteration
-* Generalized minimal residual method (with restarts) (GMRES)
+Stationary methods like [Jacobi](@ref), [Gauss-Seidel](@ref), [SOR](@ref) and [SSOR](@ref) can be used as smoothers to reduce high-frequency components in the error in just a few iterations.
 
-## Eigen Solvers
+When solving **least-squares** problems we currently offer just [LSMR](@ref LSMR) and [LSQR](@ref LSQR).
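An illustrative sketch matching the table above to concrete calls (the test matrices here are arbitrary):

```julia
using IterativeSolvers

b = rand(100)

# Symmetric positive definite -> Conjugate Gradients.
S = rand(100, 100)
S = S' * S + 100 * one(S)
x = cg(S, b)

# Nonsymmetric (with a strong diagonal here, so no preconditioner is used) -> GMRES.
N = rand(100, 100) + 100 * one(S)
y = gmres(N, b)
```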
 
-*Simple eigenpair iterations*
+## Eigenproblems and SVD
 
-* Power iteration
-* Inverse power iteration
+For the Singular Value Decomposition we offer [SVDL](@ref SVDL), which is the Golub-Kahan-Lanczos procedure.
 
-**Hermitian**
+For eigenvalue problems we have at this point just the [Power Method](@ref PowerMethod) and some convenience wrappers for shift-and-invert.
 
-*Lanczos*
+## Randomized algorithms
 
-* Simple Lanczos
+[Randomized algorithms](@ref Randomized) have gotten some traction lately. Some of the methods mentioned in [^Halko2011] have been implemented as well, although their quality is generally poor. Note that many classical methods, such as subspace iteration and BiCG, and recent methods like IDR(s), are "randomized" in some sense as well.
 
-## Singular Value Decomposition
-
-* Golub-Kahan-Lanczos
-
-## Randomized
-
-* Condition number estimate
-* Extremal eigenvalue estimates
-* Norm estimate
-* Randomized singular value decomposition
-
-## Documentation Outline
-
-### Manual
-
-```@contents
-Pages = [
-    "user_manual.md",
-]
-Depth = 2
-```
-
-### Library
-
-```@contents
-Pages = ["library/public.md", "library/internal.md"]
-Depth = 2
-```
-
-### [Index](@id main-index)
-
-### Functions
-
-```@index
-Pages = ["library/public.md", "library/internals.md"]
-Order = [:function]
-```
-
-### Types
-
-```@index
-Pages = ["library/public.md", "library/internals.md"]
-Order = [:type]
-```
+[^Halko2011]: Halko, Nathan, Per-Gunnar Martinsson, and Joel A. Tropp. "Finding structure with randomness: Probabilistic algorithms for constructing approximate matrix decompositions." SIAM Review 53.2 (2011): 217-288.
