
Commit 066228e

Fill up README
1 parent 1bf6e81 commit 066228e

File tree: 1 file changed (+41, −6 lines)


README.md

Lines changed: 41 additions & 6 deletions
@@ -1,7 +1,42 @@

Removed:

# GenOpt

Still very much a draft. I'll add more details later.

When the user uses the container syntax to create constraints, MOI receives each constraint independently, so ExaModels.jl has to recover the repeated structure. At the JuMP-dev hackathon we discussed a way to communicate the repeated structure through MOI more explicitly, and this PR is a POC of the resulting design.

As a byproduct, this also gives a solution for https://github.com/jump-dev/JuMP.jl/issues/1654, since the constraint will be expanded at the MOI level with a bridge from FunctionGenerator in Zeros to Scalar?Function in EqualTo (we need a way to know the scalar function type, but that should be doable).

We can also recover the pretty printing of containers that we had in JuMP v0.18.

Started as https://github.com/jump-dev/JuMP.jl/pull/3890

Added:

# GenOpt.jl

[![Build Status](https://github.com/blegat/GenOpt.jl/actions/workflows/ci.yml/badge.svg?branch=main)](https://github.com/blegat/GenOpt.jl/actions?query=workflow%3ACI)
[![codecov](https://codecov.io/gh/blegat/GenOpt.jl/branch/main/graph/badge.svg)](https://codecov.io/gh/blegat/GenOpt.jl)

## License

`GenOpt.jl` is licensed under the [MIT license](https://github.com/blegat/GenOpt.jl/blob/main/LICENSE.md).

## Installation

The package is not registered yet, so install `GenOpt` by passing the repository URL to `Pkg.add`:
```julia
import Pkg
Pkg.add(url = "https://github.com/blegat/GenOpt.jl")
```

## Use with JuMP

To create a group of constraints, pass `ParametrizedArray` as the `container` keyword:
```julia
using GenOpt
@constraint(
    model,
    [i in 1:n],
    x[1, i] == x0[i],
    container = ParametrizedArray,
)
```
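
The snippet above assumes that `model`, `x`, `x0`, and `n` already exist. A minimal self-contained sketch (the variable shape, dimensions, and data here are illustrative assumptions, not part of the package's API) could look like:

```julia
using JuMP, GenOpt

n = 4
x0 = rand(n)                   # illustrative initial condition
model = Model()
@variable(model, x[1:2, 1:n])  # illustrative variable shape

# A single grouped constraint object instead of n independent rows,
# so the repeated structure is preserved down to the solver.
@constraint(
    model,
    [i in 1:n],
    x[1, i] == x0[i],
    container = ParametrizedArray,
)
```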
For a grouped sum, use `lazy_sum` as follows:
```julia
@objective(
    model,
    Min,
    lazy_sum(0.5 * R[j] * u[i, j]^2 for i in 1:N, j in 1:p),
)
```
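
Similarly, the objective above presupposes `R`, `u`, `N`, and `p`; a hypothetical surrounding setup (all names and sizes below are illustrative) might be:

```julia
N, p = 10, 2                   # horizon length and number of controls (illustrative)
R = fill(0.1, p)               # control weights (illustrative)
@variable(model, u[1:N, 1:p])

# lazy_sum keeps the generator structure instead of eagerly expanding N * p terms.
@objective(model, Min, lazy_sum(0.5 * R[j] * u[i, j]^2 for i in 1:N, j in 1:p))
```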
To solve the model on a GPU through ExaModels, use:
```julia
set_optimizer(model, () -> GenOpt.ExaOptimizer(madnlp, CUDABackend()))
optimize!(model)
```
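
Since the grouped constraints are meant to be bridged back to ordinary scalar constraints at the MOI level, the same model should in principle also be solvable on the CPU with any JuMP-compatible nonlinear solver. A hedged sketch using Ipopt (Ipopt.jl is an assumption here, not a stated GenOpt dependency):

```julia
using JuMP, Ipopt

# Ordinary CPU solve for comparison; relies on the grouped constraints
# being bridged to scalar constraints, as the design intends.
set_optimizer(model, Ipopt.Optimizer)
optimize!(model)
```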
