Commit 49376b5

Update HYPERTUNING.md to include TPE strategy

Added TPE (Tree-structured Parzen Estimator) strategy to the hyperparameter tuning documentation.

1 parent 8f95bb4 · commit 49376b5

File tree

1 file changed: +4 −2 lines changed

docs/HYPERTUNING.md

Lines changed: 4 additions & 2 deletions
@@ -2,13 +2,15 @@
 
 The `ann_hypertune` module provides automated hyperparameter search to find
 optimal network configurations. It supports **grid search** (exhaustive),
-**random search** (sampling-based), and **Bayesian optimization** (intelligent) strategies.
+**random search** (sampling-based), **Bayesian optimization** (intelligent),
+and **TPE** (Tree-structured Parzen Estimator) strategies.
 
 ## Features
 
 - **Grid Search** - exhaustively tries all combinations of hyperparameters
 - **Random Search** - randomly samples from the hyperparameter space
 - **Bayesian Optimization** - intelligent search using Gaussian Process surrogate
+- **TPE** - Tree-structured Parzen Estimator
 - **Topology Patterns** - automatic layer size generation (pyramid, funnel, etc.)
 - **Per-Layer Activations** - different activation function for each layer
 - **Data Splitting** - automatic train/validation holdout with optional shuffling
@@ -208,7 +210,7 @@ int trials = hypertune_bayesian_search(
 - Smooth objective function
 - 2-3 continuous hyperparameters
 
-## TPE (Tree-structured Parzen Estimator)
+## TPE (Tree-structured Parzen Estimator) Optimization
 
 TPE is an alternative Bayesian optimization method that handles mixed
 categorical and continuous parameters better than GP-BO:
