48 commits
7d721bd
Add preliminary work on optimization interface with Python implementa…
dmitry-kabanov Nov 12, 2025
d6a929b
Add WIP on optimization interface from Python
dmitry-kabanov Nov 14, 2025
fb23517
[docs] Add technote on development of optimization interface
dmitry-kabanov Nov 20, 2025
9a4f5f8
Add notes on optimization interfaces in JuMP and Scipy Optimize
dmitry-kabanov Nov 24, 2025
3e03daa
[docs] Add missing files for version bump to technote
dmitry-kabanov Nov 27, 2025
b02a704
Try to build minimal example for nonlinear optimization
dmitry-kabanov Nov 27, 2025
7cbf36d
[misc] Update clang-tidy config to avoid warnings about Python C API …
dmitry-kabanov Dec 10, 2025
181eb5b
WIP to have return types
dmitry-kabanov Dec 10, 2025
fb8c2ef
Expand call_impl interface to include return arguments
dmitry-kabanov Dec 16, 2025
e3abce4
[python, bridge] Accept return value and check that the number of exp…
dmitry-kabanov Dec 16, 2025
8e39405
[python, converter] Pass return_args argument as a pointer as it will…
dmitry-kabanov Dec 16, 2025
cc09aa8
[python, converter] Work on reading return_args and freeing them afte…
dmitry-kabanov Dec 16, 2025
d25303c
[python, bridge] Correct convert Python string object to C null-termi…
dmitry-kabanov Dec 16, 2025
f3de523
[python, converter] Work on convert return values to proper Python va…
dmitry-kabanov Dec 17, 2025
b805a37
[python, bridge] Fix memory leak when converting Python string to C s…
dmitry-kabanov Dec 17, 2025
e673c1a
[python] Correctly convert return Python objects to C intermediate re…
dmitry-kabanov Dec 18, 2025
a15e276
[python, converter] Correctly convert C intermediate return args to P…
dmitry-kabanov Dec 18, 2025
1e55ec5
[python, converter] Rename core.py -> converter.py to match used term…
dmitry-kabanov Dec 18, 2025
edefdd7
[python] Rename converter back to core
dmitry-kabanov Dec 19, 2025
26821ed
[scipy_optimize] Return arguments from `minimize` function
dmitry-kabanov Dec 19, 2025
8214579
[tests, python] Update tests for `optim` interface
dmitry-kabanov Dec 19, 2025
ec04394
[python, optim] Implement `set_user_data` method
dmitry-kabanov Dec 19, 2025
3b0eff5
[python, Optim] Clean code - docstrings, redundant code
dmitry-kabanov Dec 19, 2025
12128e3
[tests, python] Add test for `optim::set_user_data` function
dmitry-kabanov Dec 19, 2025
551b192
[python] Implement method `optim::set_method` in `scipy_optimize`
dmitry-kabanov Jan 5, 2026
a9787f6
[python] Add method `optim::set_method` to the Optim gateway
dmitry-kabanov Jan 5, 2026
c6d674a
[docs] Update documentation
dmitry-kabanov Jan 5, 2026
c35424f
[python, tests] Add tests for optimization results getting better wit…
dmitry-kabanov Jan 5, 2026
42e8870
[examples] Add initial example for optimization in Python
dmitry-kabanov Jan 5, 2026
887d8f7
[python, misc] Fix wrong constant name
dmitry-kabanov Jan 6, 2026
e352e1d
[python, bridge] Allocate memory for a return integer only if all che…
dmitry-kabanov Jan 6, 2026
96e270d
[python, scipy_optimize] Use correct variable `method_name` instead o…
dmitry-kabanov Jan 6, 2026
295f097
[python] Use correct data type identifiers
dmitry-kabanov Jan 6, 2026
d574e5d
[python, tests] Correct the test for optimization result improving wi…
dmitry-kabanov Jan 6, 2026
271c91e
[python, optim::set_method] Check that passed method parameters are a…
dmitry-kabanov Jan 7, 2026
bccf81f
[python, scipy_optimize] Type hint properties correctly
dmitry-kabanov Jan 8, 2026
7c6e192
[python, scipy_optimize] Check available method options correctly
dmitry-kabanov Jan 8, 2026
851a5b0
[python, scipy_optimize] Check preconditions for minimize
dmitry-kabanov Jan 8, 2026
9b6f56b
[python, converter] Raise exception on the error before converting re…
dmitry-kabanov Jan 8, 2026
a5dc1e5
[python, Optim gateway] Hold reference to user_data
dmitry-kabanov Jan 12, 2026
6e22116
[python, converter] Add a reminder to `make_oif_user_data` docstring …
dmitry-kabanov Jan 12, 2026
b38a3f2
[python, Bridge] Initialize Python objects to NULL
dmitry-kabanov Jan 12, 2026
e011116
Fix style issues found by pre-commit
dmitry-kabanov Jan 12, 2026
4480c1e
[python, Bridge] Load libpython and NumPy only once
dmitry-kabanov Jan 12, 2026
7f80bf6
Fix pre-commit issues
dmitry-kabanov Jan 12, 2026
64058f5
[Julia, Converter] Handle situation when no return args are passed fo…
dmitry-kabanov Jan 13, 2026
534e593
[python, Optim gateway] Keep reference to x0 to extend its lifetime
dmitry-kabanov Jan 14, 2026
44b1826
Fix format issues found by pre-commit
dmitry-kabanov Jan 14, 2026
3 changes: 2 additions & 1 deletion .clang-tidy
@@ -1,5 +1,4 @@
---
# This file is taken from: https://nrk.neocities.org/articles/c-static-analyzers
Checks: >
performance-*,
misc-*,
@@ -26,3 +25,5 @@ WarningsAsErrors: '*'
CheckOptions:
- key: bugprone-assert-side-effect.AssertMacros
value: 'ASSERT'
- key: misc-include-cleaner.IgnoreHeaders
value: '["<Python.h>"]'
7 changes: 4 additions & 3 deletions dispatch/dispatch.c
@@ -321,7 +321,8 @@ unload_interface_impl(ImplHandle implh)
}

int
call_interface_impl(ImplHandle implh, const char *method, OIFArgs *in_args, OIFArgs *out_args)
call_interface_impl(ImplHandle implh, const char *method, OIFArgs *in_args, OIFArgs *out_args,
OIFArgs *return_args)
{
int status;

@@ -343,7 +344,7 @@ call_interface_impl(ImplHandle implh, const char *method, OIFArgs *in_args, OIFA
}

void *lib_handle = OIF_DISPATCH_HANDLES[dh];
int (*call_impl_fn)(ImplInfo *, const char *, OIFArgs *, OIFArgs *);
int (*call_impl_fn)(ImplInfo *, const char *, OIFArgs *, OIFArgs *, OIFArgs *);
call_impl_fn = dlsym(lib_handle, "call_impl");
if (call_impl_fn == NULL) {
logerr(prefix_,
@@ -352,7 +353,7 @@ call_interface_impl(ImplHandle implh, const char *method, OIFArgs *in_args, OIFA
OIF_LANG_FROM_LANG_ID[dh]);
return -3;
}
status = call_impl_fn(impl_info, method, in_args, out_args);
status = call_impl_fn(impl_info, method, in_args, out_args, return_args);

if (status) {
logerr(prefix_,
135 changes: 135 additions & 0 deletions docs/source/api/api-python/optim/index.rst
@@ -0,0 +1,135 @@
optim
=====

.. py:module:: optim

.. autoapi-nested-parse::

This module defines the interface for solving minimization problems:

.. math::
\min_x f(x)

where :math:`f : \mathbb R^n \to \mathbb R`.







Module Contents
---------------

.. py:type:: ObjectiveFn
:canonical: Callable[[np.ndarray, object], int]


Signature of the objective function :math:`f(x)`.

The function accepts two arguments:
- `x`: point at which the objective function is evaluated,
- `user_data`: additional context (user-defined data) that
  must be passed to the function (e.g., parameters of the problem).

.. py:class:: OptimResult

.. py:attribute:: status
:type: int


.. py:attribute:: x
:type: numpy.ndarray


.. py:class:: Optim(impl: str)

Interface for solving optimization (minimization) problems.

This class serves as a gateway to the implementations of the
solvers for optimization problems.

:param impl: Name of the desired implementation.
:type impl: str

.. rubric:: Examples

Let's minimize the convex quadratic function

.. math::
f(x) = \sum_{i=1}^{n} x_i^2,

whose unique minimum is at the origin.

First, import the necessary modules:

>>> import numpy as np
>>> from oif.interfaces.optim import Optim

Define the objective function:

>>> def f(x, user_data):
...     return np.sum(x**2)

Now define the initial guess:

>>> x0 = np.array([1.0, 2.0])

Create an instance of the optimizer using the implementation "scipy_optimize",
which is an adapter to the `scipy.optimize` package:

>>> s = Optim("scipy_optimize")

We set the initial guess and the objective function:

>>> s.set_initial_guess(x0)
>>> s.set_objective_fn(f)

Finally, run the minimization; the found minimizer is then available as `s.x`:

>>> s.minimize()


.. py:attribute:: x0
:type: numpy.ndarray

Initial guess for the minimizer.


.. py:attribute:: status
:value: -1



.. py:attribute:: x
:type: numpy.ndarray


.. py:method:: set_initial_guess(x0: numpy.ndarray)

Set the initial guess for the optimization problem.



.. py:method:: set_objective_fn(objective_fn: ObjectiveFn)


.. py:method:: minimize()

Minimize the objective function; the result is available via the `x` attribute.
5 changes: 4 additions & 1 deletion docs/source/technotes/2025-05-05-how-to-release.md
@@ -5,7 +5,10 @@
* `pyproject.toml`
* `version.py`
* `conf.py`
- [ ] Run pre-commit
* `Project.toml`
* `Manifest.toml`
* `CITATION.cff`
- [ ] Run `pre-commit run --all-files`
- [ ] Make a pull request and make sure it passes all the checks
- [ ] Merge the pull request to `main`
- [ ] Go to the GitHub page of the repository and press "Releases"
150 changes: 150 additions & 0 deletions docs/source/technotes/2025-11-20-optimization-interface.md
@@ -0,0 +1,150 @@
# 2025-11-20 Optimization interface

The optimization interface should accommodate implementations
for general constrained nonlinear programming:
```math
\begin{aligned}
\min_{x \in \mathbb{R}^n} \quad & f(x) \\
\text{subject to} \quad & l \leq x \leq u, \\
& l_2 \leq g(x) \leq u_2,
\end{aligned}
```
where $f : \mathbb{R}^n \to \mathbb{R}$ and $g(x)$ are nonlinear functions of $x$.

## Julia package JuMP.jl

The Julia package `JuMP` (Julia for Mathematical Programming) is a metapackage
that brings together many packages for different types of optimization problems:
for example, integer programming, linear programming, nonlinear programming, etc.
It also supports solvers with commercial licenses.

See:
https://jump.dev/JuMP.jl/stable/tutorials/getting_started/getting_started_with_JuMP

A part of JuMP is `MathOptInterface`, which hides the differences
between solvers behind a common interface.

Let's try to solve a simple problem using Ipopt, which is a solver
for continuous nonlinear programming.

The problem is
```math
\min_{x \in \mathbb{R}^N} \sum_{i=1}^{N} x_i^2,
```
which is convex, so we should be able to converge to the solution
from any initial guess.

The actual program in Julia would be
```julia
using JuMP
using Ipopt

model = Model(Ipopt.Optimizer)
set_attrbitute(model, "output_flag", false)

# Variables
@variable(model, x)

@objective(model, Min, sum(x.^2))

optimize!(model)
```

One can check the final status of the optimization process via
```
is_solved_and_feasible(model)

termination_status(model)
```
Here, `termination_status` returns constants from `MathOptInterface`:
there are constants like `OPTIMAL` (global solution),
`LOCALLY_SOLVED` (local minimum), `INFEASIBLE`, etc.

Statements for defining optimization variables are macros:
```
@variable(model, x)
```
so here `x` is a symbol, and the whole thing is transformed to actual code
(probably, something like `x = variable(model, 'x')`; they do not explain it).

To set constraints or vector variables:
```
@variable(model, -5 <= x[i=1:42] <= 7)
```
Also, upper and lower bounds can be set via keyword arguments to this macro.

Interestingly, when I do
```
typeof(x)
```
then `x` is `VariableRef`.


## SciPy Optimize

It has multiple solvers, although they have slightly different interfaces
and features: some support constrained optimization, some do not.

Minimization happens through a single function:
```python
from scipy.optimize import minimize

res = minimize(
f, # Objective function
x0, # Initial guess
method="method-name", # Methods are below
args=(a, b, c, ...), # Args that are passed unfolded to f
jac=None, # Callback | True, if f returns obj and jac together
hess=None, # Callback for Hessian matrix
hessp=None, # Callback for computing Hessian-vector product
options={}, # Dictionary of options
)
```
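
For concreteness, here is a minimal runnable sketch of the schematic call above,
minimizing the convex quadratic from the JuMP section with `Nelder-Mead`
(the starting point is chosen arbitrarily for illustration):

```python
import numpy as np
from scipy.optimize import minimize


def f(x):
    # Convex quadratic objective; the unique minimum is at the origin.
    return np.sum(x**2)


res = minimize(f, x0=np.array([1.0, 2.0]), method="Nelder-Mead")
print(res.x, res.fun, res.success)  # minimizer, objective value, success flag
```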

The interface is general, and not all solvers (methods) use all arguments.

Method names are case-insensitive.

Solvers that use only the objective:
- `Nelder-Mead`
- `Powell`

Solvers that use objective and Jacobian:
- `BFGS` Broyden-Fletcher-Goldfarb-Shanno
(can estimate Jacobian via finite differences)

Solvers that also take a Hessian or a Hessian-vector product:
- `Newton-CG` (Newton Conjugate Gradient) H or Hp
- `trust-NCG` Trust-region Newton-Conjugate Gradient
- `trust-Krylov`

Only Hessian:
- `trust-exact` Trust-region Nearly Exact Algorithm: decomposes Hessian via
Cholesky factorization
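
For the gradient-based methods above, the gradient is passed via the `jac`
callback; a small sketch with an analytic gradient (same illustrative quadratic):

```python
import numpy as np
from scipy.optimize import minimize


def f(x):
    return np.sum(x**2)


def grad(x):
    # Analytic gradient of the quadratic objective.
    return 2.0 * x


res = minimize(f, x0=np.array([3.0, -4.0]), method="BFGS", jac=grad)
print(res.x)  # should be close to the origin
```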

### Scipy Optimize: Constrained minimization

Methods are:
- `trust-constr`
- `SLSQP`
- `COBYLA`
- `COBYQA`

**Complication**: they use different interfaces to specify constraints:
`SLSQP` uses a dictionary, while the others use `LinearConstraint` and
`NonlinearConstraint` instances.

Linear constraints are written as
\[
\begin{pmatrix}
c_1^\ell \\
\vdots \\
c_L^\ell
\end{pmatrix}
\leq
A x
\leq
\begin{pmatrix}
c_1^u \\
\vdots \\
c_L^u
\end{pmatrix}.
\]
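
To make the difference concrete, here is a sketch of both constraint styles
on the same toy problem (the bounds and constraint functions are made up
purely for illustration):

```python
import numpy as np
from scipy.optimize import minimize, LinearConstraint, NonlinearConstraint


def f(x):
    return np.sum(x**2)


x0 = np.array([2.0, 2.0])

# `trust-constr` style: constraints as LinearConstraint/NonlinearConstraint objects.
lin = LinearConstraint([[1.0, 1.0]], 1.0, np.inf)                 # x1 + x2 >= 1
nonlin = NonlinearConstraint(lambda x: x[0] * x[1], 0.1, np.inf)  # x1 * x2 >= 0.1
res_tc = minimize(f, x0, method="trust-constr", constraints=[lin, nonlin])

# `SLSQP` style: constraints as dictionaries ("ineq" means fun(x) >= 0).
cons = [{"type": "ineq", "fun": lambda x: x[0] + x[1] - 1.0}]
res_slsqp = minimize(f, x0, method="SLSQP", constraints=cons)

print(res_tc.x, res_slsqp.x)
```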
18 changes: 18 additions & 0 deletions docs/source/technotes/2025-12-10-out-and-return-args.md
@@ -0,0 +1,18 @@
# Extended Dispatch/Bridge API

So far, we have an API like this:
```
int call_interface_impl(implh, func_name, in_args, out_args);
```
where `in_args` and `out_args` are arrays of input and output arguments.
Output arguments are provided by the caller and the callee writes into them.

Now we want an extended API:
```
int call_interface_impl(implh, func_name, in_args, out_args, return_args)
```
where `return_args` is a semi-filled array: it contains the data types
and the number of arguments, but it is the Bridge that allocates the memory
and writes into these arguments, passing ownership back to the caller
(a Converter, to be precise), which converts the returned data
from the C intermediate representation back to the native language of the user.