diff --git a/README.md b/README.md
index 61b49c4..afb2bf0 100644
--- a/README.md
+++ b/README.md
@@ -20,6 +20,8 @@
 Built for machine learning practitioners requiring flexible and robust hyperparameter tuning, **ConfOpt** delivers superior optimization performance through conformal uncertainty quantification and a wide selection of surrogate models.
 
+**ConfOpt** also lends itself well to HPO research and to use as an add-on, as it has few [dependencies](https://github.com/rick12000/confopt/blob/main/requirements.txt) and focuses on pure search methodology.
+
 ## 📦 Installation
 
 Install ConfOpt from PyPI using pip:
 
@@ -110,6 +112,14 @@
 Finally, we retrieve the optimization's best parameters and test accuracy score
 
 For detailed examples and explanations see the [documentation](https://confopt.readthedocs.io/).
+## 🔗 Integrations
+
+Advanced users should note that **ConfOpt** does not currently support parallelization, multi-fidelity optimization, or multi-objective optimization.
+
+If you wish to use **ConfOpt** with parallelization or multi-fidelity/pruning, fear not: there is an [Optuna](https://github.com/optuna) integration that supports both. Parallelization support has been well tested, while multi-fidelity/pruning is still experimental (it should work well and has been spot tested, but please raise an issue if you run into any problems).
+
+For instructions on how to use **ConfOpt** with Optuna, refer to the official documentation [here](https://hub.optuna.org/samplers/confopt_sampler/).
+
 ## 📚 Documentation
 
 ### **User Guide**