From d4ddf71dab3952c9bcfb18fdb6278e8e1b67c27b Mon Sep 17 00:00:00 2001
From: Riccardo Doyle
Date: Sat, 25 Oct 2025 14:27:35 +0100
Subject: [PATCH 1/2] update readme

---
 README.md | 10 ++++++++++
 1 file changed, 10 insertions(+)

diff --git a/README.md b/README.md
index 61b49c4..62112d6 100644
--- a/README.md
+++ b/README.md
@@ -20,6 +20,8 @@
 Built for machine learning practitioners requiring flexible and robust hyperparameter tuning, **ConfOpt** delivers superior optimization performance through conformal uncertainty quantification and a wide selection of surrogate models.
 
+**ConfOpt** also lends itself well to HPO research and to use as an add-on, as it requires limited [dependencies](https://github.com/rick12000/confopt/blob/main/requirements.txt) and focuses on pure search methodology.
+
 ## 📦 Installation
 
 Install ConfOpt from PyPI using pip:
@@ -110,6 +112,14 @@
 Finally, we retrieve the optimization's best parameters and test accuracy score.
 
 For detailed examples and explanations see the [documentation](https://confopt.readthedocs.io/).
 
+## 🔗 Integrations
+
+Advanced users should note **ConfOpt** doesn't currently support parallelization, multi-fidelity optimization, or multi-objective optimization.
+
+If you wish to use **ConfOpt** with parallelization or multi-fidelity/pruning, fear not: there's an [Optuna](https://github.com/optuna) integration that supports both.
+
+For instructions on how to use **ConfOpt** in Optuna, refer to the official documentation [here](https://hub.optuna.org/samplers/confopt_sampler/).
+
 ## 📚 Documentation
 
 ### **User Guide**

From 1bfcd7c2770c00b76231c59f4af76fb35887052f Mon Sep 17 00:00:00 2001
From: Riccardo Doyle
Date: Sat, 25 Oct 2025 14:40:24 +0100
Subject: [PATCH 2/2] update readme

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 62112d6..afb2bf0 100644
--- a/README.md
+++ b/README.md
@@ -116,7 +116,7 @@
 For detailed examples and explanations see the [documentation](https://confopt.readthedocs.io/).
 
 Advanced users should note **ConfOpt** doesn't currently support parallelization, multi-fidelity optimization, or multi-objective optimization.
 
-If you wish to use **ConfOpt** with parallelization or multi-fidelity/pruning, fear not: there's an [Optuna](https://github.com/optuna) integration that supports both.
+If you wish to use **ConfOpt** with parallelization or multi-fidelity/pruning, fear not: there's an [Optuna](https://github.com/optuna) integration that supports both. Parallelization support has been well tested, while multi-fidelity/pruning is still experimental (it should work well and has been spot-tested, but if there are any problems please raise an issue).
 
 For instructions on how to use **ConfOpt** in Optuna, refer to the official documentation [here](https://hub.optuna.org/samplers/confopt_sampler/).