Learning to optimize (LearningToOptimize) package that provides basic functionalities to help fit proxy models for parametric optimization problems.

Have a look at our sister [Hugging Face organization](https://huggingface.co/LearningToOptimize) for datasets, pre-trained models, and benchmarks.

# Background

Parametric optimization problems arise in scenarios where certain elements (e.g., coefficients, constraints) may vary according to problem parameters. A general form of a parameterized convex optimization problem is

$$
\begin{aligned}
&\min_{x} \quad f(x; \theta) \\
&\text{subject to} \quad g_i(x; \theta) \leq 0, \quad i = 1,\dots, m \\
&\quad\quad\quad\quad A(\theta)x = b(\theta)
\end{aligned}
$$

where $\theta$ is the parameter.
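
For concreteness, here is a minimal sketch of such a problem in JuMP, with the parameter declared via `MOI.Parameter`. The choice of HiGHS as the solver is an assumption; any solver reachable through MOI's parameter bridges should behave similarly.

```julia
using JuMP, HiGHS

# Toy parametric LP: min 2x  s.t.  x >= θ, where θ is a parameter.
model = Model(HiGHS.Optimizer)
@variable(model, x)
@variable(model, θ in Parameter(1.0))  # JuMP wrapper around MOI.Parameter
@constraint(model, x >= θ)
@objective(model, Min, 2x)

optimize!(model)
value(x)  # ≈ 1.0

# Re-solve for a new parameter value without rebuilding the model.
set_parameter_value(θ, 3.0)
optimize!(model)
value(x)  # ≈ 3.0
```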

**Learning to Optimize (L2O)** is an emerging paradigm where machine learning models *learn* to solve optimization problems efficiently. This approach is also known as using **optimization proxies** or **amortized optimization**.

In more technical terms, **amortized optimization** seeks to learn a function $h_w(\theta)$ that maps problem parameters $\theta$ to solutions $x$ that (approximately) minimize a given objective function subject to constraints. Modern methods leverage techniques like **differentiable optimization layers**, **input-convex neural networks**, or constraint-enforcing architectures (e.g., [DC3](https://openreview.net/pdf?id=0Ow8_1kM5Z)) to ensure that the learned proxy solutions are both feasible and performant. By coupling the solver and the model in an **end-to-end** pipeline, these approaches let the training objective directly reflect downstream metrics, improving speed and reliability.
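
As a schematic illustration (not this package's API), a proxy can be fit by plain regression on pre-solved parameter/solution pairs. The sketch below assumes Flux.jl and hypothetical training data `θs` and `xs`:

```julia
using Flux

# Schematic proxy: a small MLP mapping a parameter vector to a predicted
# solution. Assumes θs and xs are vectors of Float32 vectors holding
# pre-solved instances (hypothetical data, not produced by this snippet).
proxy = Chain(Dense(1 => 64, relu), Dense(64 => 1))
opt_state = Flux.setup(Adam(1e-3), proxy)

for epoch in 1:100
    for (θ, x) in zip(θs, xs)
        grads = Flux.gradient(m -> Flux.mse(m(θ), x), proxy)
        Flux.update!(opt_state, proxy, grads[1])
    end
end
```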

Recent advances also focus on **trustworthy** or **certifiable** proxies, where constraint satisfaction or performance bounds are guaranteed. This is crucial in domains like energy systems or manufacturing, where infeasible solutions can have large penalties or safety concerns. Overall, learning-based optimization frameworks aim to combine the advantages of ML (data-driven generalization) with the rigor of mathematical programming (constraint handling and optimality).

For a broader overview, see the [SIAM News article on trustworthy optimization proxies](https://www.siam.org/publications/siam-news/articles/fusing-artificial-intelligence-and-optimization-with-trustworthy-optimization-proxies/), which highlights the growing synergy between AI and classical optimization.

# Installation

```julia
] add LearningToOptimize
```
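
Equivalently, using the Pkg API:

```julia
import Pkg
Pkg.add("LearningToOptimize")
```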

## Generate Dataset

This package provides a basic way of generating a dataset of solutions to an optimization problem by varying the parameter values and recording the corresponding solutions.
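
A plausible minimal setup is sketched below; the names `ProblemIterator`, `save`, and `CSVFile`, and their signatures, are assumptions inferred from the `load` example further down, not a confirmed API:

```julia
using JuMP, HiGHS
using LearningToOptimize

# Hypothetical one-parameter model.
model = Model(HiGHS.Optimizer)
@variable(model, x)
@variable(model, p in Parameter(1.0))
@constraint(model, x >= p)
@objective(model, Min, 2x)

# Sweep p over 1.0, 2.0, ..., 10.0 and save the parameter values to CSV.
problem_iterator = ProblemIterator(Dict(p => collect(1.0:10.0)))
save(problem_iterator, "input_file", CSVFile)
```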

Which creates the following CSV:

| id | p    |
|----|------|
| …  | …    |
| 9  | 9.0  |
| 10 | 10.0 |

P.S.: For illustration purposes, the ids are represented here as integers, but in reality they are generated as UUIDs.

To load the parameter values back:

```julia
problem_iterator = load("input_file.csv", CSVFile)
```

### Samplers

Instead of defining parameter instances manually, one may sample parameter values using pre-defined samplers (e.g., `scaled_distribution_sampler` or `box_sampler`) or define a custom sampler. Samplers are functions that take a vector of parameters of type `MOI.Parameter` and return a matrix of parameter values.
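
For example, a hand-rolled sampler might look like the following sketch. The orientation of the returned matrix (parameters by samples) is an assumption, and Distributions.jl is used for the draws:

```julia
import MathOptInterface as MOI
using Distributions

# Hypothetical custom sampler: for each parameter, draw `n` values uniformly
# within ±50% of its current value (assumes positive parameter values).
# Returns a length(parameters) × n matrix (orientation assumed).
function my_sampler(parameters::Vector{MOI.Parameter{Float64}}; n::Int = 10)
    return [rand(Uniform(0.5 * p.value, 1.5 * p.value)) for p in parameters, _ in 1:n]
end
```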

The easiest way to go from problem definition to sampled and saved parameter values is the `general_sampler` function, which applies a set of samplers to cover the parameter space. It loads the underlying model from a `file` readable by JuMP's `read_from_file` (currently only tested with `MathOptFormat`), samples the parameters, and saves the sampled values to `save_file`.
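
A hypothetical invocation is shown below; only `file` and `save_file` appear in the description above, so the `samplers` keyword and the exact call shape are guesses, not a confirmed signature:

```julia
using LearningToOptimize

# Hypothetical call: sample a model stored in MathOptFormat and save the
# sampled parameter values. Exact keyword names may differ.
general_sampler(
    "problem.mof.json";
    samplers = [scaled_distribution_sampler, box_sampler],
    save_file = "sampled_parameters",
)
```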