Replies: 2 comments 1 reply
-
If the random search is, well, random, you have no assurance that setting the number of trials >= the number of combinations will result in all combinations being tested, only that the probability of any specific combination not being tested approaches zero as the number of trials approaches infinity. Think of it like a classic die-roll or coin-flip experiment: the probability of heads is 0.5, but that doesn't mean you can't get ten tails in a row; it's just that the compound probability of such an event is low. In any case, a grid search could be implemented. We already know the search space, so it could be done.
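To make the compound-probability point concrete, here is a minimal sketch (assuming independent, uniform draws with replacement over N equally likely combinations; the function name `prob_missed` is just illustrative, not part of any library):

```python
# Probability that one specific combination is never drawn in
# n_trials independent, uniform picks (with replacement) from
# n_combinations equally likely combinations.
def prob_missed(n_combinations: int, n_trials: int) -> float:
    return (1 - 1 / n_combinations) ** n_trials

# Even with trials == combinations, a given combination is still
# missed roughly 1/e of the time; the probability only shrinks
# toward zero as the number of trials grows.
print(prob_missed(100, 100))   # ~0.366
print(prob_missed(100, 1000))  # ~4.3e-05
```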
-
I think this is an important issue. We should consider implementing a grid search tuner.
-
I often use Keras Tuner because of its flexibility: subclassing the Tuner class gives you a great deal of flexibility during the hyperparameter search process.
The problem is that I need to search through all the combinations in the search space, but when using tuners like RandomSearch with max_trials >= the number of combinations, it doesn't go through all of them. I believe this happens because the tuner samples combinations from the search space, and once every new pick has already been picked before, the oracle triggers an exit.
So, is there any way to implement a grid-search-like tuner using keras-tuner?
I am curious how to implement a custom tuner, other than RandomSearch/BayesianOptimization/Hyperband, within the overall keras-tuner framework.
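One way to do this within the framework is to subclass `keras_tuner.Oracle`, override `populate_space` so it walks the Cartesian product of the declared search space, and wrap it in a plain `keras_tuner.Tuner`. The sketch below assumes the whole space is declared up front and consists of `Choice` hyperparameters; the class name `GridSearchOracle` and the toy model are illustrative, not part of KerasTuner. (Recent KerasTuner releases may also ship a built-in `GridSearch` tuner, so it is worth checking your installed version before rolling your own.)

```python
import itertools

import keras_tuner as kt
import tensorflow as tf


class GridSearchOracle(kt.Oracle):
    """Sketch of an oracle that visits every combination exactly once."""

    def __init__(self, objective, hyperparameters, **kwargs):
        super().__init__(
            objective=objective,
            hyperparameters=hyperparameters,
            # Keep the space fixed to what was declared up front.
            allow_new_entries=False,
            tune_new_entries=False,
            **kwargs,
        )
        # Cartesian product over the declared values. This assumes every
        # hyperparameter is a Choice with a finite `values` list; Int/Float
        # ranges would need an explicit grid of values instead.
        names = [hp.name for hp in self.hyperparameters.space]
        value_lists = [list(hp.values) for hp in self.hyperparameters.space]
        self._grid = [
            dict(zip(names, combo))
            for combo in itertools.product(*value_lists)
        ]
        self._index = 0

    def populate_space(self, trial_id):
        if self._index >= len(self._grid):
            # Grid exhausted: tell the tuner to stop the search.
            # The status strings mirror keras_tuner's TrialStatus constants.
            return {"status": "STOPPED", "values": None}
        values = self._grid[self._index]
        self._index += 1
        return {"status": "RUNNING", "values": values}


# Declare the full search space up front so the oracle can enumerate it.
hps = kt.HyperParameters()
hps.Choice("units", [32, 64, 128])
hps.Choice("learning_rate", [1e-2, 1e-3])


def build_model(hp):
    # Toy model just to show how the trial's values are consumed.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(hp.get("units"), activation="relu"),
        tf.keras.layers.Dense(1),
    ])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(hp.get("learning_rate")),
        loss="mse",
    )
    return model


tuner = kt.Tuner(
    oracle=GridSearchOracle(
        objective=kt.Objective("val_loss", "min"),
        hyperparameters=hps,
    ),
    hypermodel=build_model,
    directory="kt_grid_demo",
    project_name="grid_search_sketch",
)
# tuner.search(x_train, y_train, validation_data=(x_val, y_val), epochs=5)
```

With 3 x 2 = 6 combinations above, the search runs exactly six trials and then stops, because `populate_space` returns a STOPPED status once the grid is exhausted instead of resampling and bailing out the way the random oracle does.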