@@ -7,22 +7,31 @@ Solves for ``f(u)=0`` in the problem defined by `prob` using the algorithm
## Recommended Methods

- `NewtonRaphson` is a good choice for most problems. For large
- systems, it can make use of sparsity patterns for sparse automatic differentiation
- and sparse linear solving of very large systems. That said, as a classic Newton
- method, its stability region can be smaller than other methods. Meanwhile,
- `SimpleNewtonRaphson` is an implementation which is specialized for
- small equations. It is non-allocating on static arrays and thus really well-optimized
+ The default method `FastShortcutNonlinearPolyalg` is a good choice for most
+ problems. It is a polyalgorithm that attempts to use a fast algorithm
+ (`Klement`, `Broyden`) and, if that fails, falls back to a more robust
+ algorithm (`NewtonRaphson`) before falling back to the most robust variant of
+ `TrustRegion`. For basic problems this will be very fast; for harder problems
+ it trades some speed for robustness so the solve still succeeds.
+
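Below is a minimal sketch of using the default: the two-equation system and parameter values are illustrative (not from these docs); calling `solve` without an algorithm argument selects the default polyalgorithm.

```julia
using NonlinearSolve

# Illustrative system f(u, p) = 0; any small nonlinear system works the same way.
f(u, p) = [u[1]^2 - p[1], u[1] + u[2] - p[2]]
u0 = [1.0, 1.0]
p = [2.0, 3.0]
prob = NonlinearProblem(f, u0, p)

# No algorithm given, so the default polyalgorithm is used.
sol = solve(prob)
```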
+ If one is looking for more robustness, then `RobustMultiNewton` is a good choice.
+ It attempts a set of the most robust methods in succession and only fails if
+ all of the methods fail to converge. Additionally, `DynamicSS` can be a good choice
+ for high stability.
+
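As a rough sketch (the toy problem is illustrative, not from these docs), the robust polyalgorithm is selected by passing it explicitly:

```julia
using NonlinearSolve

# Illustrative toy problem.
f(u, p) = u .* u .- p
prob = NonlinearProblem(f, [1.0], [2.0])

# Tries a sequence of highly robust methods; fails only if all of them fail.
sol = solve(prob, RobustMultiNewton())
```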
+ As a balance, `NewtonRaphson` is a good choice for most problems that aren't too
+ difficult yet need high performance, and `TrustRegion` is a bit less performant
+ but more stable. If the problem is well-conditioned, `Klement` or `Broyden`
+ may be faster, but this is highly dependent on the eigenvalues of the Jacobian
+ being sufficiently small.
+
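A minimal sketch of picking between these solvers, again on an illustrative toy problem (whether `Klement` or `Broyden` pays off depends on the problem, as noted above):

```julia
using NonlinearSolve

# Illustrative toy problem.
f(u, p) = u .* u .- p
prob = NonlinearProblem(f, [1.0], [2.0])

sol_newton = solve(prob, NewtonRaphson())  # fast on problems that aren't too hard
sol_tr     = solve(prob, TrustRegion())    # a bit slower, but more stable
sol_klem   = solve(prob, Klement())        # can be faster when well-conditioned
```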
+ `NewtonRaphson` and `TrustRegion` are designed for large systems.
+ They can make use of sparsity patterns for sparse automatic differentiation
+ and sparse linear solving of very large systems. Meanwhile,
+ `SimpleNewtonRaphson` and `SimpleTrustRegion` are implementations which are specialized for
+ small equations. They are non-allocating on static arrays and thus really well-optimized
for small systems, thus usually outperforming the other methods when such types are
- used for `u0`. `DynamicSS` can be a good choice for high stability.
-
- For a system which is very non-stiff (i.e., the condition number of the Jacobian
- is small, or the eigenvalues of the Jacobian are within a few orders of magnitude),
- then `NLSolveJL`'s `:anderson` can be a good choice.
-
- !!! note
-
-     `TrustRegion` and `SimpleTrustRegion` are still in development.
+ used for `u0`.
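A hedged sketch of the small static-array case, assuming StaticArrays.jl is available (the system itself is illustrative):

```julia
using NonlinearSolve, SimpleNonlinearSolve, StaticArrays

# With an SVector `u0` and an out-of-place `f` returning an SVector,
# the simple solvers can run without heap allocations.
f(u, p) = SA[u[1]^2 - p, u[1] + u[2]]
u0 = SA[1.0, 1.0]
prob = NonlinearProblem(f, u0, 2.0)

sol = solve(prob, SimpleNewtonRaphson())
```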
## Full List of Methods
@@ -46,6 +55,13 @@ features, but have a bit of overhead on very small problems.
improvements suggested in the [paper](https://arxiv.org/abs/1201.5885) "Improvements to
the Levenberg-Marquardt algorithm for nonlinear least-squares minimization". Designed for
large-scale and numerically-difficult nonlinear systems.
+ - `RobustMultiNewton()`: A polyalgorithm that mixes highly robust methods (line searches and
+ trust regions) in order to be as robust as possible for difficult problems. If this method
+ fails to converge, then one can be fairly certain that most, if not all, other choices would
+ likely fail.
+ - `FastShortcutNonlinearPolyalg`: The default method. A polyalgorithm that mixes fast methods
+ with fallbacks to robust methods to allow for solving easy problems quickly without sacrificing
+ robustness on the hard problems.
### SimpleNonlinearSolve.jl