Commit 5c00695

Add README section for hyperparameter tuning jobs (#216)

1 parent 524a8ce commit 5c00695

README.rst: 52 additions & 1 deletion
@@ -30,7 +30,8 @@ Table of Contents

 5. `Chainer SageMaker Estimators <#chainer-sagemaker-estimators>`__
 6. `AWS SageMaker Estimators <#aws-sagemaker-estimators>`__
 7. `BYO Docker Containers with SageMaker Estimators <#byo-docker-containers-with-sagemaker-estimators>`__
-8. `BYO Model <#byo-model>`__
+8. `SageMaker Automatic Model Tuning <#sagemaker-automatic-model-tuning>`__
+9. `BYO Model <#byo-model>`__

 Getting SageMaker Python SDK
@@ -263,6 +264,56 @@ Please refer to the full example in the examples repo:

 The example notebook is located here:
 ``advanced_functionality/scikit_bring_your_own/scikit_bring_your_own.ipynb``

SageMaker Automatic Model Tuning
--------------------------------

All of the estimators can be used with SageMaker Automatic Model Tuning, which performs hyperparameter tuning jobs. A hyperparameter tuning job runs multiple training jobs that differ only in their hyperparameter values, then identifies the best-performing one. The SageMaker Python SDK contains a ``HyperparameterTuner`` class for creating and interacting with hyperparameter tuning jobs. You can read more about SageMaker Automatic Model Tuning in the `AWS documentation <https://docs.aws.amazon.com/sagemaker/latest/dg/automatic-model-tuning.html>`__.
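Conceptually, a tuning job works like the following plain-Python sketch of random search over a continuous hyperparameter range. This is purely illustrative: ``run_training_job`` is a hypothetical stand-in for a real training job (the actual service launches managed training jobs, possibly in parallel), not part of the SDK.

```python
import random

def run_training_job(learning_rate):
    # Hypothetical stand-in for a real training job: returns a
    # validation score that happens to peak near learning_rate = 0.055.
    return 1.0 - abs(learning_rate - 0.055)

def tune(max_jobs, low, high, seed=0):
    """Run `max_jobs` trials with randomly sampled hyperparameters
    and return the best (score, learning_rate) pair."""
    rng = random.Random(seed)
    results = []
    for _ in range(max_jobs):
        lr = rng.uniform(low, high)        # sample from a continuous range
        results.append((run_training_job(lr), lr))
    return max(results)                    # best-scoring trial wins

best_score, best_lr = tune(max_jobs=20, low=0.05, high=0.06)
```

The real service adds strategies beyond random sampling and tracks each trial as a full training job, but the select-the-best loop above is the core idea.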

Here is a basic example of how to use ``HyperparameterTuner`` to start tuning jobs instead of using an estimator to start training jobs:

.. code:: python

    from sagemaker.tuner import HyperparameterTuner, ContinuousParameter

    # Configure HyperparameterTuner
    my_tuner = HyperparameterTuner(estimator=my_estimator,  # previously-configured Estimator object
                                   objective_metric_name='validation-accuracy',
                                   hyperparameter_ranges={'learning-rate': ContinuousParameter(0.05, 0.06)},
                                   metric_definitions=[{'Name': 'validation-accuracy', 'Regex': r'validation-accuracy=(\d\.\d+)'}],
                                   max_jobs=100,
                                   max_parallel_jobs=10)

    # Start hyperparameter tuning job
    my_tuner.fit({'train': 's3://my_bucket/my_training_data', 'test': 's3://my_bucket/my_testing_data'})

    # Deploy best model
    my_predictor = my_tuner.deploy(initial_instance_count=1, instance_type='ml.m4.xlarge')

    # Make a prediction against the SageMaker endpoint
    response = my_predictor.predict(my_prediction_data)

    # Tear down the SageMaker endpoint
    my_tuner.delete_endpoint()
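The ``Regex`` in ``metric_definitions`` tells SageMaker how to extract the objective metric from the training logs. Applied to a hypothetical log line, the pattern above captures the numeric value like so:

```python
import re

# The same pattern used in metric_definitions above
pattern = r'validation-accuracy=(\d\.\d+)'

# A hypothetical line emitted by a training script
log_line = 'epoch 7: validation-accuracy=0.9213 train-loss=0.31'

match = re.search(pattern, log_line)
value = float(match.group(1))  # -> 0.9213
```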

Each ``HyperparameterTuner`` instance also provides an analytics object, which presents useful information about the hyperparameter tuning job, such as a pandas dataframe summarizing the associated training jobs:

.. code:: python

    # Retrieve analytics object
    my_tuner_analytics = my_tuner.analytics()

    # Look at summary of associated training jobs
    my_dataframe = my_tuner_analytics.dataframe()
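Because the summary is an ordinary pandas dataframe, it can be filtered and sorted with standard pandas operations. A minimal sketch, using a toy stand-in dataframe (the column names here are illustrative assumptions, not a guaranteed schema):

```python
import pandas as pd

# Toy stand-in for the dataframe returned by the analytics object;
# column names are illustrative, not a guaranteed schema.
df = pd.DataFrame({
    'TrainingJobName': ['job-1', 'job-2', 'job-3'],
    'learning-rate': [0.051, 0.055, 0.059],
    'FinalObjectiveValue': [0.89, 0.93, 0.91],
})

# Rank training jobs by objective metric, best first
ranked = df.sort_values('FinalObjectiveValue', ascending=False)
best_job = ranked.iloc[0]['TrainingJobName']  # -> 'job-2'
```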

For more detailed examples of running hyperparameter tuning jobs, see: https://github.com/awslabs/amazon-sagemaker-examples.

For more detailed explanations of the classes mentioned, see:

- `API docs for HyperparameterTuner and parameter range classes <https://sagemaker.readthedocs.io/en/latest/tuner.html>`__.
- `API docs for analytics classes <https://sagemaker.readthedocs.io/en/latest/analytics.html>`__.

FAQ
---
