Releases: CyberAgentAILab/cmaes
v0.13.0
New Algorithms
COMO-CatCMA with Margin [Hamano et al. 2026]
COMO-CatCMA with Margin (COMO-CatCMAwM) performs multi-objective mixed-variable optimization by coordinating multiple CatCMAwM optimizers. It currently supports two-objective problems; support for three or more objectives is planned.
```python
import numpy as np
from cmaes import COMOCatCMAwM


def DSIntLFTL(x, z, c, cat_num):
    Sphere1 = sum((x / 10) ** 2) / len(x)
    Sphere2 = sum((x / 10 - 1) ** 2) / len(x)
    SphereInt1 = sum((z / 10) ** 2) / len(z)
    SphereInt2 = sum((z / 10 - 1) ** 2) / len(z)
    c_idx = c.argmax(axis=1)
    LF = (len(c) - (c_idx == 0).cumprod().sum()) / len(c)
    TL = (len(c) - (c_idx == np.asarray(cat_num) - 1)[::-1].cumprod().sum()) / len(c)
    obj1 = Sphere1 + SphereInt1 + LF
    obj2 = Sphere2 + SphereInt2 + TL
    return [obj1, obj2]


if __name__ == "__main__":
    # [lower_bound, upper_bound] for each continuous variable
    X = [[-5, 15]] * 3
    # possible values for each integer variable
    Z = [range(-5, 16)] * 3
    # number of categories for each categorical variable
    C = [5] * 3

    optimizer = COMOCatCMAwM(x_space=X, z_space=Z, c_space=C)

    evals = 0
    while evals < 7000:
        solutions = []
        for sol in optimizer.ask_iter():
            value = DSIntLFTL(sol.x, sol.z, sol.c, C)
            evals += 1
            solutions.append((sol, value))
        optimizer.tell(solutions)
        print(evals, optimizer.incumbent_objectives)
```

References
- [Hamano et al. 2026] R. Hamano, M. Nomura, S. Saito, K. Uchida, S. Shirakawa, CatCMA with Margin for Single- and Multi-Objective Mixed-Variable Black-Box Optimization, arXiv:2504.07884, 2026.
What's Changed
- Update the Optuna's example code by @c-bata in #207
- Make cmaes a bit faster by @c-bata in #157
- COMO-CatCMA with Margin by @ha-mano in #211
- Update README.md by @nomuramasahir0 in #212
- Enable OIDC publishing for PyPI by @c-bata in #213
- Remove tests for free-threaded support by @c-bata in #214
- Update the supported Python versions to 3.9-3.14 by @c-bata in #215
- Migrate to ruff from black/isort/flake8 by @c-bata in #216
- Migrate to uv by @c-bata in #217
- Bump the version up to v0.13.0 by @c-bata in #218
- Update the workflow for release by @c-bata in #219
Full Changelog: v0.12.0...v0.13.0
PyPI: https://pypi.org/project/cmaes/
v0.12.0
New Algorithms
CatCMA with Margin (GECCO2025) by @ha-mano
CatCMA with Margin (CatCMAwM) is a method for mixed-variable optimization problems, simultaneously optimizing continuous, integer, and categorical variables. CatCMAwM extends CatCMA by introducing a novel integer handling mechanism, and supports arbitrary combinations of continuous, integer, and categorical variables in a unified framework.
Source code
```python
import numpy as np
from cmaes import CatCMAwM


def SphereIntCOM(x, z, c):
    return sum(x * x) + sum(z * z) + len(c) - sum(c[:, 0])


def SphereInt(x, z):
    return sum(x * x) + sum(z * z)


def SphereCOM(x, c):
    return sum(x * x) + len(c) - sum(c[:, 0])


def f_cont_int_cat():
    # [lower_bound, upper_bound] for each continuous variable
    X = [[-5, 5], [-5, 5]]
    # possible values for each integer variable
    Z = [[-1, 0, 1], [-2, -1, 0, 1, 2]]
    # number of categories for each categorical variable
    C = [3, 3]

    optimizer = CatCMAwM(x_space=X, z_space=Z, c_space=C)

    for generation in range(50):
        solutions = []
        for _ in range(optimizer.population_size):
            sol = optimizer.ask()
            value = SphereIntCOM(sol.x, sol.z, sol.c)
            solutions.append((sol, value))
            print(f"#{generation} {sol} evaluation: {value}")
        optimizer.tell(solutions)


def f_cont_int():
    # [lower_bound, upper_bound] for each continuous variable
    X = [[-np.inf, np.inf], [-np.inf, np.inf]]
    # possible values for each integer variable
    Z = [[-2, -1, 0, 1, 2], [-2, -1, 0, 1, 2]]

    # initial distribution parameters (Optional)
    # If you know a promising solution for X and Z, set init_mean to that value.
    init_mean = np.ones(len(X) + len(Z))
    init_cov = np.diag(np.ones(len(X) + len(Z)))
    init_sigma = 1.0

    optimizer = CatCMAwM(
        x_space=X, z_space=Z, mean=init_mean, cov=init_cov, sigma=init_sigma
    )

    for generation in range(50):
        solutions = []
        for _ in range(optimizer.population_size):
            sol = optimizer.ask()
            value = SphereInt(sol.x, sol.z)
            solutions.append((sol, value))
            print(f"#{generation} {sol} evaluation: {value}")
        optimizer.tell(solutions)


def f_cont_cat():
    # [lower_bound, upper_bound] for each continuous variable
    X = [[-5, 5], [-5, 5]]
    # number of categories for each categorical variable
    C = [3, 5]

    # initial distribution parameters (Optional)
    init_cat_param = np.array(
        [
            [0.5, 0.3, 0.2, 0.0, 0.0],  # zero-padded at the end
            [0.2, 0.2, 0.2, 0.2, 0.2],  # each row must sum to 1
        ]
    )

    optimizer = CatCMAwM(x_space=X, c_space=C, cat_param=init_cat_param)

    for generation in range(50):
        solutions = []
        for _ in range(optimizer.population_size):
            sol = optimizer.ask()
            value = SphereCOM(sol.x, sol.c)
            solutions.append((sol, value))
            print(f"#{generation} {sol} evaluation: {value}")
        optimizer.tell(solutions)


if __name__ == "__main__":
    f_cont_int_cat()
    # f_cont_int()
    # f_cont_cat()
```

We recommend using CatCMAwM for continuous+integer and continuous+categorical settings. In particular, [Hamano et al. 2025] shows that CatCMAwM outperforms CMA-ES with Margin in mixed-integer scenarios. Therefore, we suggest CatCMAwM in place of CMA-ES with Margin or CatCMA.
CMA-ES-SoP (PPSN 2024) by @kento031
CMA-ES on sets of points (CMA-ES-SoP) is a variant of CMA-ES for optimization on sets of points, where the search space consists of several disjoint subspaces, each containing multiple candidate points at which the objective function can be evaluated. In mixed-variable cases, some subspaces are continuous. Note that discrete subspaces with more than five dimensions incur significant computational cost for constructing the Voronoi diagrams.
Source code
```python
import numpy as np
from cmaes.cma_sop import CMASoP

# numbers of dimensions in each subspace
subspace_dim_list = [2, 3, 5]
cont_dim = 10
# numbers of points in each subspace
point_num_list = [10, 20, 40]
# number of total dimensions
dim = int(np.sum(subspace_dim_list) + cont_dim)


# objective function
def quadratic(x):
    coef = 1000 ** (np.arange(dim) / float(dim - 1))
    return np.sum((coef * x) ** 2)


# sets_of_points (on [-5, 5])
discrete_subspace_num = len(subspace_dim_list)
sets_of_points = [
    (2 * np.random.rand(point_num_list[i], subspace_dim_list[i]) - 1) * 5
    for i in range(discrete_subspace_num)
]

# add the optimal solution (for benchmark function)
for i in range(discrete_subspace_num):
    sets_of_points[i][-1] = np.zeros(subspace_dim_list[i])
    np.random.shuffle(sets_of_points[i])

# optimizer (CMA-ES-SoP)
optimizer = CMASoP(
    sets_of_points=sets_of_points,
    mean=np.random.rand(dim) * 4 + 1,
    sigma=2.0,
)

best_eval = np.inf
eval_count = 0
for generation in range(400):
    solutions = []
    for _ in range(optimizer.population_size):
        # Ask a parameter
        x, enc_x = optimizer.ask()
        value = quadratic(enc_x)
        # save best eval
        best_eval = np.min((best_eval, value))
        eval_count += 1
        solutions.append((x, value))
    # Tell evaluation values.
    optimizer.tell(solutions)
    print(f"#{generation} ({best_eval} {eval_count})")
    if best_eval < 1e-4 or optimizer.should_stop():
        break
```

Maximum a Posteriori CMA-ES (PPSN 2024) by @ha-mano
MAP-CMA is a method introduced to interpret the rank-one update in CMA-ES from the perspective of the natural gradient.
The rank-one update derived from this perspective is extensible, and an additional term, called the momentum update, appears in the update of the mean vector.
The performance of MAP-CMA is not significantly different from that of CMA-ES, as the primary motivation for MAP-CMA is a theoretical understanding of CMA-ES.
Source code
```python
import numpy as np
from cmaes import MAPCMA


def rosenbrock(x):
    dim = len(x)
    if dim < 2:
        raise ValueError("dimension must be greater than one")
    return sum(100 * (x[:-1] ** 2 - x[1:]) ** 2 + (x[:-1] - 1) ** 2)


if __name__ == "__main__":
    dim = 20
    optimizer = MAPCMA(mean=np.zeros(dim), sigma=0.5, momentum_r=dim)
    print(" evals    f(x)")
    print("======  ==========")

    evals = 0
    while True:
        solutions = []
        for _ in range(optimizer.population_size):
            x = optimizer.ask()
            value = rosenbrock(x)
            evals += 1
            solutions.append((x, value))
            if evals % 1000 == 0:
                print(f"{evals:5d}  {value:10.5f}")
        optimizer.tell(solutions)
        if optimizer.should_stop():
            break
```

Safe CMA-ES (GECCO 2024) by @kento031
Safe CMA-ES is a variant of CMA-ES for safe optimization. Safe optimization is formulated as a special class of constrained optimization that aims to solve the problem with as few evaluations as possible of solutions whose safety function values exceed the safety thresholds. Safe CMA-ES requires safe seeds that do not violate the safety constraints. Note that Safe CMA-ES is designed for noiseless safe optimization. This module requires torch and gpytorch.
Source code
```python
import numpy as np
from cmaes.safe_cma import SafeCMA


# objective function
def quadratic(x):
    coef = 1000 ** (np.arange(dim) / float(dim - 1))
    return np.sum((x * coef) ** 2)


# safety function
def safe_function(x):
    return x[0]


"""
example with a single safety function
"""
if __name__ == "__main__":
    # number of dimensions
    dim = 5

    # safe seeds
    safe_seeds_num = 10
    safe_seeds = (np.random.rand(safe_seeds_num, dim) * 2 - 1) * 5
    safe_seeds[:, 0] = -np.abs(safe_seeds[:, 0])

    # evaluation of safe seeds (with a single safety function)
    seeds_evals = np.array([quadratic(x) for x in safe_seeds])
    seeds_safe_evals = np.stack([[safe_function(x)] for x in safe_seeds])
    safety_threshold = np.array([0])

    # optimizer (safe CMA-ES)
    optimizer = SafeCMA(
        sigma=1.0,
        safety_threshold=safety_threshold,
        safe_seeds=safe_seeds,
        seeds_evals=seeds_evals,
        seeds_safe_evals=seeds_safe_evals,
    )

    unsafe_eval_counts = 0
    best_eval = np.inf
    for generation in range(400):
        solutions = []
        for _ in range(optimizer.population_size):
            # Ask a parameter
            x = optimizer.ask()
            value = quadratic(x)
            safe_value = np.array([safe_function(x)])
            # save best eval
            best_eval = np.min((best_eval, value))
            unsafe_eval_counts += (safe_value > safety_threshold)
            solutions.append((x, value, safe_value))
        # Tell evaluation values.
        optimizer.tell(solutions)
        print(f"#{generation} ({best_eval} {unsafe_eval_counts})")
        if optimizer.should_stop():
            break
```

What's Changed
v0.11.1
What's Changed
Full Changelog: v0.11.0...v0.11.1
v0.11.0
Highlights
CatCMA [Hamano+, GECCO2024]
arXiv: https://arxiv.org/pdf/2405.09962
CatCMA is a method for mixed-category optimization problems, i.e., problems of simultaneously optimizing continuous and categorical variables. CatCMA employs the joint probability distribution of multivariate Gaussian and categorical distributions as the search distribution.
Usage is as follows:
```python
import numpy as np
from cmaes import CatCMA


def sphere_com(x, c):
    dim_co = len(x)
    dim_ca = len(c)
    if dim_co < 2:
        raise ValueError("dimension must be greater than one")
    sphere = sum(x * x)
    com = dim_ca - sum(c[:, 0])
    return sphere + com


def rosenbrock_clo(x, c):
    dim_co = len(x)
    dim_ca = len(c)
    if dim_co < 2:
        raise ValueError("dimension must be greater than one")
    rosenbrock = sum(100 * (x[:-1] ** 2 - x[1:]) ** 2 + (x[:-1] - 1) ** 2)
    clo = dim_ca - (c[:, 0].argmin() + c[:, 0].prod() * dim_ca)
    return rosenbrock + clo


def mc_proximity(x, c, cat_num):
    dim_co = len(x)
    dim_ca = len(c)
    if dim_co < 2:
        raise ValueError("dimension must be greater than one")
    if dim_co != dim_ca:
        raise ValueError(
            "number of dimensions of continuous and categorical variables "
            "must be equal in mc_proximity"
        )
    c_index = np.argmax(c, axis=1) / cat_num
    return sum((x - c_index) ** 2) + sum(c_index)


if __name__ == "__main__":
    cont_dim = 5
    cat_dim = 5
    cat_num = np.array([3, 4, 5, 5, 5])
    # cat_num = 3 * np.ones(cat_dim, dtype=np.int64)
    optimizer = CatCMA(mean=3.0 * np.ones(cont_dim), sigma=1.0, cat_num=cat_num)

    for generation in range(200):
        solutions = []
        for _ in range(optimizer.population_size):
            x, c = optimizer.ask()
            value = mc_proximity(x, c, cat_num)
            if generation % 10 == 0:
                print(f"#{generation} {value}")
            solutions.append(((x, c), value))
        optimizer.tell(solutions)
        if optimizer.should_stop():
            break
```

What's Changed
- Add support for Python 3.12 by @c-bata in #153
- Remove `setup.py` and use build module by @c-bata in #154
- Fix CI failures by @c-bata in #158
- get mean by @nomuramasahir0 in #159
- add question template by @nomuramasahir0 in #162
- fix sigma setting by @nomuramasahir0 in #160
- Add GitHub action setting for continuous benchmark by @c-bata in #168
- fix typo by @nomuramasahir0 in #169
- fix BIPOP-CMA in visualization by @nomuramasahir0 in #170
- update readme by @nomuramasahir0 in #171
- update by @nomuramasahir0 in #172
- fix the old_sigma assertion when lr_adapt=True by @Kreyparion in #174
- remove kurobako dependency by @nomuramasahir0 in #175
- Support for numpy v2.0 by @porink0424 in #177
- catcma (GECCO2024) by @ha-mano in #178
- fix CatCMA by @ha-mano in #179
- Update README.md by @ha-mano in #181
- Bump the version up to v0.11.0 by @c-bata in #183
New Contributors
- @Kreyparion made their first contribution in #174
- @porink0424 made their first contribution in #177
- @ha-mano made their first contribution in #178
Full Changelog: v0.10.0...v0.11.0
v0.10.0
What's Changed
- add DX-NES-IC by @nomuramasahir0 in #149
- xNES implementation by @nomuramasahir0 in #150
- add LRA-CMA-ES by @nomuramasahir0 in #151
- Bump the version up to v0.10.0 by @c-bata in #152
Full Changelog: v0.9.1...v0.10.0
v0.9.1
What's Changed
- Remove `tox.ini` by @c-bata in #131
- Fix a broken link to Optuna's documentation by @c-bata in #132
- Drop Python 3.6 support. by @c-bata in #130
- Reuse `CMA` inside `CMAwM` by @knshnb in #133
- Add rng related methods by @knshnb in #135
- Fix correction of out-of-range continuous params of `CMAwM` by @knshnb in #134
- Fix correction of out-of-range discrete params of `CMAwM` by @knshnb in #136
- Avoid to use `typing.List`, `typing.Dict`, and `typing.Tuple`. by @c-bata in #139
- Check feasibility of sampled discrete parameters in `CMAwM` by @knshnb in #140
- Refactor `CMAwM` by @knshnb in #141
- Add a test case for no discrete spaces. by @c-bata in #143
- Allow no discrete spaces in `CMAwM` by @knshnb in #142
- Remove warnings in CMAwM class by @c-bata in #144
- Revert handling of infeasible discrete parameters by @knshnb in #145
- Bump the version up to v0.9.1 by @c-bata in #138
- Revert handling of infeasible discrete parameters by @knshnb in #145
- Bump the version up to v0.9.1 by @c-bata in #138
Full Changelog: v0.9.0...v0.9.1
v0.9.0
Highlights
CMA-ES with Margin is now available. It introduces a lower bound on the marginal probability associated with each discrete dimension so that samples are not fixed to a single point. It can be applied to mixed spaces of continuous (float) and discrete variables (including integer and binary). This algorithm was proposed by Hamano, Saito, @nomuramasahir0 (a maintainer of this library), and Shirakawa, and was nominated for best paper at the GECCO'22 ENUM track.
(Figures comparing CMA-ES and CMA-ESwM; taken from EvoConJP/CMA-ES_with_Margin.)
Please check out the following examples for the usage.
What's Changed
- Running benchmark of Warm Starting CMA-ES on GitHub Actions. by @c-bata in #99
- Validate bounds domain contains mean by @c-bata in #100
- Fix overflow errors uncovered by Coverage-guided Fuzzing. by @c-bata in #104
- Fuzzing for sep-CMA-ES by @c-bata in #105
- Set license_file on setup.cfg by @c-bata in #106
- fix sep-CMA description by @nomuramasahir0 in #107
- Temporarily disable a GitHub action for kurobako benchmarks by @c-bata in #113
- Fix mutable by @nomuramasahir0 in #112
- Run tests with Python 3.10 by @c-bata in #109
- Update author and maintainer package info. by @c-bata in #116
- Introduce some related projects on README by @c-bata in #118
- Migrate the project metadata to pyproject.toml by @c-bata in #119
- Revert #119 to support Python 3.6. by @c-bata in #122
- Support CMA-ES with Margin. by @knshnb in #121
- Add integer examples for CMA-ES with Margin by @nomuramasahir0 in #125
- Support Python 3.11 by @c-bata in #123
- Add README of CMA-ES with margin by @knshnb in #124
- Follow-up #126: Remove Scipy dependency by @c-bata in #127
- Remove SciPy dependency by @amylase in #126
- Use gh instead of ghr by @c-bata in #128
- Bump the version up to v0.9.0 by @c-bata in #129
Full Changelog: v0.8.2...v0.9.0
v0.8.2
CHANGES
- Fix dimensions of Warm starting CMA-ES (#98).
- Thank you @Yibinjiang for reporting the bug.
v0.8.1
CHANGES
- Unset version constraint of numpy.
- Remove `extra_requires` for development.
v0.8.0
CHANGES
New features
Warm-starting CMA-ES is now available. It estimates a promising distribution from a similar optimization task and uses it to generate the parameters of the multivariate Gaussian distribution that initializes CMA-ES, so that you can exploit optimization results from that task. This algorithm was proposed by @nmasahiro, a maintainer of this library, and accepted at AAAI 2021.
(Figures: optimization results on the Rot Ellipsoid and Ellipsoid benchmarks.)