README.md: 15 additions & 13 deletions
@@ -38,7 +38,7 @@ pip install -e .
## 🎯 Getting Started
- The example below shows how to optimize hyperparameters for a RandomForest classifier.
+ The example below shows how to optimize hyperparameters for a RandomForest classifier. You can find more examples in the [documentation](https://confopt.readthedocs.io/).
### Step 1: Import Required Libraries
@@ -50,7 +50,7 @@ from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
```
- We import the necessary libraries for tuning and model evaluation. The `load_wine` function is used to load the wine dataset, which serves as our example data for optimizing the hyperparameters of the RandomForest classifier.
+ We import the necessary libraries for tuning and model evaluation. The `load_wine` function is used to load the wine dataset, which serves as our example data for optimizing the hyperparameters of the RandomForest classifier (the dataset is trivial and we can easily reach 100% accuracy; this is for example purposes only).
- This function defines the objective we want to optimize. It loads the wine dataset, splits it into training and testing sets, and trains a RandomForest model using the provided configuration. The function returns the accuracy score, which serves as the optimization metric.
+ This function defines the objective we want to optimize. It loads the wine dataset, splits it into training and testing sets, and trains a RandomForest model using the provided configuration. The function returns test accuracy, which will be the objective value ConfOpt will optimize for.
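The objective function described above can be sketched as follows. This is an illustrative reconstruction from the prose, not ConfOpt's documented code; in particular, the exact signature ConfOpt expects for objective functions (here, a single `configuration` dict) is an assumption.

```python
from sklearn.datasets import load_wine
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split


def objective(configuration: dict) -> float:
    # Load the wine dataset and split it into training and testing sets:
    X, y = load_wine(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=42
    )
    # Train a RandomForest model using the candidate hyperparameters:
    model = RandomForestClassifier(**configuration, random_state=42)
    model.fit(X_train, y_train)
    # Return test accuracy, the value the tuner will maximize:
    return accuracy_score(y_test, model.predict(X_test))
```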
- Here, we specify the search space for hyperparameters. This includes defining the range for the number of estimators, the proportion of features to consider when looking for the best split, and the criterion for measuring the quality of a split.
+ Here, we specify the search space for hyperparameters. In this Random Forest example, this includes defining the range for the number of estimators, the proportion of features to consider when looking for the best split, and the criterion for measuring the quality of a split.
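As a rough sketch, the search space described above covers three hyperparameters. The plain-dict notation below is illustrative only; ConfOpt defines its own parameter-range types, so consult its documentation for the real syntax.

```python
# Illustrative search space for the RandomForest example (notation is a
# plain-dict stand-in, not ConfOpt's actual range classes):
search_space = {
    # Number of trees in the forest (integer range):
    "n_estimators": (50, 500),
    # Proportion of features considered when looking for the best split:
    "max_features": (0.1, 1.0),
    # Criterion for measuring the quality of a split (categorical):
    "criterion": ["gini", "entropy"],
}
```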
### Step 4: Create and Run the Tuner
@@ -95,7 +95,7 @@ tuner = ConformalTuner(
)
tuner.tune(max_searches=50, n_random_searches=10)
```
- We initialize the `ConformalTuner` with the objective function and search space. The tuner is then run to find the best hyperparameters by maximizing the accuracy score.
+ We initialize the `ConformalTuner` with the objective function and search space. The `tune` method then kicks off the hyperparameter search and finds the hyperparameters that maximize test accuracy.
If you'd like to contribute, please email [r.doyle.edu@gmail.com](mailto:r.doyle.edu@gmail.com) with a quick summary of the feature you'd like to add and we can discuss it before setting up a PR!
+
+ If you want to contribute a fix for a new bug, first raise an [issue](https://github.com/rick12000/confopt/issues) on GitHub, then email [r.doyle.edu@gmail.com](mailto:r.doyle.edu@gmail.com) referencing the issue. Issues are monitored regularly, so only send an email if you want to contribute a fix yourself.
## 📄 License
@@ -148,5 +150,5 @@ TBI
<div align="center">
<strong>Ready to take your hyperparameter optimization to the next level?</strong><br>
docs/roadmap.rst: 1 addition & 0 deletions
@@ -12,6 +12,7 @@ Functionality
* **Multi Objective Support**: Allow searchers to optimize for more than one objective (e.g. accuracy and runtime).
* **Transfer Learning Support**: Allow searchers to use a pretrained model or an observation matcher as a starting point for tuning.
* **Local Search**: The Expected Improvement sampler currently only performs one-off configuration scoring. Local search (where a local neighbourhood around the initial EI optimum is explored as a second-pass refinement) can significantly improve performance.
+ * **Hierarchical Hyperparameters**: Improved handling for hierarchical hyperparameter spaces (currently supported via flattening of the hyperparameters, but potentially suboptimal for surrogate learning).
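The "flattening" approach mentioned in the bullet above can be sketched as follows. This is a generic illustration of the idea (collapsing a nested space into dot-separated flat keys so every hyperparameter looks like an independent dimension to the surrogate), not ConfOpt's internal implementation.

```python
# Illustrative sketch: flatten a nested (hierarchical) search space into
# dot-separated keys, so a surrogate model sees a flat parameter list.
def flatten_space(space: dict, prefix: str = "") -> dict:
    flat = {}
    for key, value in space.items():
        name = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            # Recurse into nested sub-spaces, extending the key prefix:
            flat.update(flatten_space(value, prefix=name))
        else:
            flat[name] = value
    return flat


nested = {"model": {"n_estimators": (50, 500), "criterion": ["gini", "entropy"]}}
# flatten_space(nested) yields keys "model.n_estimators" and "model.criterion".
```

The downside noted in the roadmap is that flattening discards the conditional structure: the surrogate cannot tell that some flattened dimensions are only active under certain parent choices.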