Commit bdeb819

Update Readme (#10)
* update readme
1 parent ad9efe4 commit bdeb819

1 file changed

README.md

Lines changed: 10 additions & 0 deletions
@@ -20,6 +20,8 @@
 
 Built for machine learning practitioners requiring flexible and robust hyperparameter tuning, **ConfOpt** delivers superior optimization performance through conformal uncertainty quantification and a wide selection of surrogate models.
 
+**ConfOpt** also lends itself well to HPO research and to use as an add-on, requiring limited [dependencies](https://github.com/rick12000/confopt/blob/main/requirements.txt) and focusing on pure search methodology.
+
 ## 📦 Installation
 
 Install ConfOpt from PyPI using pip:
@@ -110,6 +112,14 @@ Finally, we retrieve the optimization's best parameters and test accuracy score
 
 For detailed examples and explanations see the [documentation](https://confopt.readthedocs.io/).
 
+## 🔗 Integrations
+
+Advanced users should note that **ConfOpt** doesn't currently support parallelization, multi-fidelity optimization, or multi-objective optimization.
+
+If you wish to use **ConfOpt** with parallelization or multi-fidelity/pruning, fear not: there's an [Optuna](https://github.com/optuna) integration that supports both. Parallelization support has been well tested, while multi-fidelity/pruning is still experimental (it has been spot tested and should work, but please raise an issue if you run into problems).
+
+For instructions on how to use **ConfOpt** with Optuna, refer to the official documentation [here](https://hub.optuna.org/samplers/confopt_sampler/).
+
## 📚 Documentation
 
### **User Guide**
