All of the estimators can be used with SageMaker Automatic Model Tuning, which performs hyperparameter tuning jobs. A hyperparameter tuning job runs multiple training jobs that differ by their hyperparameters to find the best-performing one. The SageMaker Python SDK contains a ``HyperparameterTuner`` class for creating and interacting with hyperparameter tuning jobs. You can read more about SageMaker Automatic Model Tuning in the `AWS documentation <https://docs.aws.amazon.com/sagemaker/latest/dg/automatic-model-tuning.html>`__.

Here is a basic example of how to use ``HyperparameterTuner`` to start tuning jobs instead of using an estimator to start training jobs:

.. code:: python

    from sagemaker.tuner import HyperparameterTuner, ContinuousParameter
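The import above is only the entry point; a minimal sketch of configuring a tuner and launching a tuning job might look like the following, assuming ``my_estimator`` is one of the estimators configured as shown earlier in this overview. The objective metric, regular expression, hyperparameter range, and S3 paths are illustrative placeholders rather than values prescribed by this guide:

.. code:: python

    # ``my_estimator`` stands in for a previously configured estimator;
    # the metric name, regex, range, and S3 URIs below are placeholders.
    my_tuner = HyperparameterTuner(
        estimator=my_estimator,
        objective_metric_name='validation-accuracy',
        hyperparameter_ranges={'learning-rate': ContinuousParameter(0.05, 0.06)},
        metric_definitions=[{'Name': 'validation-accuracy',
                             'Regex': r'validation-accuracy=([0-9\.]+)'}],
        max_jobs=9,
        max_parallel_jobs=3)

    # Start the hyperparameter tuning job; channel names and S3 locations
    # depend on the estimator and dataset being used
    my_tuner.fit({'train': 's3://my-bucket/train', 'test': 's3://my-bucket/test'})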
There is also an analytics object associated with each ``HyperparameterTuner`` instance, which presents useful information about the hyperparameter tuning job, such as a pandas DataFrame summarizing the associated training jobs:

.. code:: python

    # Retrieve analytics object
    my_tuner_analytics = my_tuner.analytics()

    # Look at summary of associated training jobs
    my_dataframe = my_tuner_analytics.dataframe()
For more detailed examples of running hyperparameter tuning jobs, see: https://github.com/awslabs/amazon-sagemaker-examples.

For more detailed explanations of the classes mentioned, see:

- `API docs for HyperparameterTuner and parameter range classes <https://sagemaker.readthedocs.io/en/latest/tuner.html>`__.
- `API docs for analytics classes <https://sagemaker.readthedocs.io/en/latest/analytics.html>`__.