
Commit a297008

Add comprehensive reusage interface documentation
This documentation explains how to efficiently reuse optimization problems using `reinit!` and caching interfaces, following the patterns established in LinearSolve.jl and NonlinearSolve.jl.

Key features covered:
- Basic `reinit!` usage with parameter updates
- Parameter sweeps and warm-starting techniques
- Advanced caching with iterator interface
- Performance benefits and best practices
- Complete examples for production workflows

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <[email protected]>
1 parent 4ec420c commit a297008

File tree

2 files changed: +172 -0 lines changed


docs/pages.jl

Lines changed: 1 addition & 0 deletions
@@ -7,6 +7,7 @@ pages = ["index.md",
         "tutorials/linearandinteger.md",
         "tutorials/minibatch.md",
         "tutorials/remakecomposition.md",
+        "tutorials/reusage_interface.md",
         "tutorials/symbolic.md"
     ],
     "Examples" => [
Lines changed: 171 additions & 0 deletions
@@ -0,0 +1,171 @@
# Optimization Problem Reusage and Caching Interface

One of the key features for high-performance optimization workflows is the ability to reuse and modify existing optimization problems without full reconstruction. Optimization.jl provides several mechanisms for efficiently reusing optimization setups, particularly through the `reinit!` function and caching interfaces.

## Basic Reusage with `reinit!`

The `reinit!` function lets you update the solver cache created by `init` and reuse it for further solves without reconstructing the problem. This is particularly useful for parameter sweeps, sensitivity analysis, and warm-starting optimization problems.

```julia
using Optimization, OptimizationOptimJL, ForwardDiff

# Define the Rosenbrock function
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

# Initial setup; BFGS is gradient-based, so attach an AD backend
x0 = [0.0, 0.0]
p = [1.0, 100.0]
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, x0, p)

# Build the solver cache and solve the first optimization
cache = init(prob, BFGS())
sol1 = solve!(cache)

# Reinitialize with new parameters without reconstructing the entire problem
Optimization.reinit!(cache; p = [2.0, 100.0])
sol2 = solve!(cache)

# Reinitialize with new initial conditions
Optimization.reinit!(cache; u0 = [1.0, 1.0], p = [1.0, 100.0])
sol3 = solve!(cache)
```

## Parameter Sweeps

The `reinit!` function is particularly powerful for parameter sweeps, where you need to solve the same optimization problem with different parameter values:

```julia
using Optimization, OptimizationOptimJL, ForwardDiff

# Define the optimization problem
f(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = [0.0, 0.0]
optf = OptimizationFunction(f, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, x0, [1.0, 100.0])

# Build the cache once and reuse it for every parameter value
cache = init(prob, BFGS())

# Parameter sweep
parameter_values = [50.0, 100.0, 150.0, 200.0]
solutions = []

for p_val in parameter_values
    Optimization.reinit!(cache; u0 = x0, p = [1.0, p_val])
    push!(solutions, solve!(cache))
end

# Access results
for (i, sol) in enumerate(solutions)
    println("Parameter $(parameter_values[i]): Minimum at $(sol.u)")
end
```

## Warm-Starting Optimization

You can use previous solutions as starting points for new optimizations, which can significantly improve convergence:

```julia
using Optimization, OptimizationOptimJL, ForwardDiff

# Define a more complex optimization problem
complex_objective(x, p) = sum((x[i] - p[i])^2 for i in eachindex(x)) +
                          sum(sin(x[i]) for i in eachindex(x))

# Initial problem
n = 10
x0 = zeros(n)
p0 = ones(n)
optf = OptimizationFunction(complex_objective, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, x0, p0)

# Solve the initial problem
cache = init(prob, BFGS())
sol1 = solve!(cache)

# Use the previous solution as a warm start for a new parameter set
new_params = 1.1 * ones(n)  # Slightly different parameters
Optimization.reinit!(cache; u0 = sol1.u, p = new_params)  # Warm start with previous solution
sol2 = solve!(cache)

# Compare convergence (iteration counts are reported in `sol.stats`)
println("Initial problem converged in $(sol1.stats.iterations) iterations")
println("Warm-started problem converged in $(sol2.stats.iterations) iterations")
```

## Advanced Caching with `init` and `solve!`

For finer-grained control, you can split `solve` into its two halves: `init`, which builds the solver cache, and `solve!`, which runs the optimization on that cache. The usual solve keyword arguments, such as a per-iteration `callback`, can be attached when the cache is created, giving you access to the optimization state at every step:

```julia
using Optimization, OptimizationOptimJL, ForwardDiff

# Setup optimization problem
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, [0.0, 0.0], [1.0, 100.0])

# Callback invoked at every iteration; returning false tells the solver to continue
function progress_callback(state, obj)
    println("Iteration $(state.iter): Current point = $(state.u), Objective = $(obj)")
    return false
end

# Initialize the solver (this creates the cache) with the callback attached
cache = init(prob, BFGS(); callback = progress_callback)

# Get final solution
sol = solve!(cache)
```

## Performance Benefits

The reusage interface provides several performance advantages:

1. **Reduced Memory Allocation**: Reusing problem structures avoids repeated memory allocations
2. **Warm Starting**: Using previous solutions as initial guesses can reduce the iterations needed
3. **Solver State Preservation**: Internal solver states (like Hessian approximations) can be preserved
4. **Batch Processing**: Efficient processing of multiple related optimization problems

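To make the allocation point concrete, here is a rough sketch comparing a sweep that rebuilds the problem for every parameter set with one that reuses a single cache. The helper names `sweep_rebuild` and `sweep_cache` are illustrative only, not part of the Optimization.jl API, and actual timings and allocation counts will depend on the solver and problem size.

```julia
using Optimization, OptimizationOptimJL, ForwardDiff

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
ps = [[1.0, q] for q in 50.0:10.0:250.0]

# Variant 1: construct a fresh problem (and solver cache) for every parameter set
function sweep_rebuild(ps)
    map(ps) do p
        prob = OptimizationProblem(optf, [0.0, 0.0], p)
        solve(prob, BFGS()).u
    end
end

# Variant 2: build one cache and reinit! it for every parameter set
function sweep_cache(ps)
    prob = OptimizationProblem(optf, [0.0, 0.0], first(ps))
    cache = init(prob, BFGS())
    map(ps) do p
        Optimization.reinit!(cache; u0 = [0.0, 0.0], p = p)
        solve!(cache).u
    end
end

# The cached variant avoids re-allocating the problem and solver cache each time
@time sweep_rebuild(ps)
@time sweep_cache(ps)
```
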
## When to Use `reinit!` vs `remake`

- **Use `reinit!`** when:
  - You want to preserve solver internal state
  - Parameters or initial conditions change slightly
  - You're doing parameter sweeps or sensitivity analysis
  - Performance is critical

- **Use `remake`** when:
  - The problem structure changes significantly
  - You need a completely fresh start
  - Problem dimensions change

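As a minimal sketch of the distinction (using only `remake`, `init`, `reinit!`, and `solve!` as introduced above): `remake` returns a brand-new problem that needs a fresh cache, while `reinit!` updates an existing cache in place.

```julia
using Optimization, OptimizationOptimJL, ForwardDiff

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, [0.0, 0.0], [1.0, 100.0])

# remake builds a new problem, so the solver starts from scratch
prob2 = remake(prob; u0 = [0.5, 0.5], p = [2.0, 100.0])
sol_fresh = solve(prob2, BFGS())

# reinit! updates an existing cache, keeping the solver setup alive
cache = init(prob, BFGS())
sol_a = solve!(cache)
Optimization.reinit!(cache; p = [2.0, 100.0])
sol_b = solve!(cache)
```
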
## Example: Complete Parameter Study

Here's a comprehensive example showing how to efficiently perform a parameter study:

```julia
using Optimization, OptimizationOptimJL, ForwardDiff, Plots

# Define the objective function
objective(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2 + p[3] * x[1] * x[2]

# Setup the base problem
x0 = [0.0, 0.0]
base_params = [1.0, 100.0, 0.0]
optf = OptimizationFunction(objective, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, x0, base_params)

# Build the cache once for the whole study
cache = init(prob, BFGS())

# Parameter study over the third parameter
param3_range = -5.0:0.5:5.0
results = Dict()

for p3 in param3_range
    # Efficiently update only the changing parameter
    new_params = [1.0, 100.0, p3]
    Optimization.reinit!(cache; u0 = x0, p = new_params)

    # Solve and store results
    sol = solve!(cache)
    results[p3] = (solution = sol.u, objective = sol.objective)
end

# Analyze results
objectives = [results[p3].objective for p3 in param3_range]
plot(param3_range, objectives, xlabel = "Parameter 3", ylabel = "Objective Value",
     title = "Parameter Study using reinit!")
```

This reusage interface makes Optimization.jl highly efficient for production optimization workflows where the same problem structure is solved repeatedly with different parameters or initial conditions.
