articles/machine-learning/how-to-train-scikit-learn.md (5 additions & 5 deletions)
@@ -8,7 +8,7 @@ ms.subservice: core
 ms.topic: conceptual
 ms.author: maxluk
 author: maxluk
-ms.date: 08/02/2019
+ms.date: 03/09/2020
 ms.custom: seodec18
 
 #Customer intent: As a Python scikit-learn developer, I need to combine open-source with a cloud platform to train, evaluate, and deploy my machine learning models at scale.
@@ -37,7 +37,7 @@ Run this code on either of these environments:
 - [Create a workspace configuration file](how-to-configure-environment.md#workspace).
 - You can also find a completed [Jupyter Notebook version](https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/ml-frameworks/scikit-learn/training/train-hyperparameter-tune-deploy-with-sklearn/train-hyperparameter-tune-deploy-with-sklearn.ipynb) of this guide on the GitHub samples page. The notebook includes an expanded section covering intelligent hyperparameter tuning and retrieving the best model by primary metrics.
 
 ## Set up the experiment
@@ -155,7 +155,7 @@ run = experiment.submit(estimator)
 run.wait_for_completion(show_output=True)
 ```
 
-As the Run is executed, it goes through the following stages:
+As the run is executed, it goes through the following stages:
 
 - **Preparing**: A docker image is created according to the TensorFlow estimator. The image is uploaded to the workspace's container registry and cached for later runs. Logs are also streamed to the run history and can be viewed to monitor progress.
@@ -177,7 +177,7 @@ import joblib
 joblib.dump(svm_model_linear, 'model.joblib')
 ```
 
-Register the model to your workspace with the following code. By specifying the parameters `model_framework`, `model_framework_version`, and `resource_configuration`, no-code model deployment becomes available. This allows you to directly deploy your model as a web service from the registered model, and the `ResourceConfiguration` object defines the compute resource for the web service.
+Register the model to your workspace with the following code. By specifying the parameters `model_framework`, `model_framework_version`, and `resource_configuration`, no-code model deployment becomes available. This allows you to directly deploy your model as a web service from the registered model, and the [`ResourceConfiguration`](https://docs.microsoft.com/python/api/azureml-core/azureml.core.resource_configuration.resourceconfiguration?view=azure-ml-py) object defines the compute resource for the web service.
 
 ```Python
 from azureml.core import Model
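Saving the model with `joblib.dump` before registration, as the snippet above does, produces a file that round-trips cleanly back into a usable estimator. The sketch below illustrates that round trip with plain scikit-learn and joblib only (no Azure ML calls); the variable name `svm_model_linear` follows the article, while the iris data and hyperparameters are illustrative:

```python
# Minimal sketch of the joblib save/load round trip, using plain
# scikit-learn. The training data and SVC settings are illustrative.
import joblib
from sklearn.datasets import load_iris
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
svm_model_linear = SVC(kernel="linear", C=1).fit(X, y)

joblib.dump(svm_model_linear, "model.joblib")  # serialize to disk
restored = joblib.load("model.joblib")         # deserialize later, e.g. at scoring time

# The restored estimator behaves identically to the original.
assert (restored.predict(X) == svm_model_linear.predict(X)).all()
```

The `model.joblib` file written here is the artifact you would then point `Model.register` at.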
@@ -197,7 +197,7 @@ contains a section on registering models, but you can skip directly to [creating
 
 ### (Preview) No-code model deployment
 
-Instead of the traditional deployment route, you can also use the no-code deployment feature (preview) for scikit-learn. No-code model deployment is supported for all built-in scikit-learn model types. By registering your model as shown above with the `model_framework`, `model_framework_version`, and `resource_configuration` parameters, you can simply use the `deploy()` static function to deploy your model.
+Instead of the traditional deployment route, you can also use the no-code deployment feature (preview) for scikit-learn. No-code model deployment is supported for all built-in scikit-learn model types. By registering your model as shown above with the `model_framework`, `model_framework_version`, and `resource_configuration` parameters, you can simply use the [`deploy()`](https://docs.microsoft.com/python/api/azureml-core/azureml.core.model%28class%29?view=azure-ml-py#deploy-workspace--name--models--inference-config-none--deployment-config-none--deployment-target-none--overwrite-false-) static function to deploy your model.
articles/machine-learning/how-to-train-with-datasets.md (4 additions & 4 deletions)
@@ -10,16 +10,16 @@ ms.author: sihhu
 author: MayMSFT
 manager: cgronlun
 ms.reviewer: nibaccam
-ms.date: 09/25/2019
+ms.date: 03/09/2020
 
-# Customer intent: As an experienced Python developer, I need to make my data available to my remote compute to train my machine learning models.
+# Customer intent: As an experienced Python developer, I need to make my data available to my local or remote compute to train my machine learning models.
-In this article, you learn the two ways to consume [Azure Machine Learning datasets](https://docs.microsoft.com/python/api/azureml-core/azureml.core.dataset%28class%29?view=azure-ml-py) in a remote experiment training runs without worrying about connection strings or data paths.
+In this article, you learn the two ways to consume [Azure Machine Learning datasets](https://docs.microsoft.com/python/api/azureml-core/azureml.core.dataset%28class%29?view=azure-ml-py) in remote experiment training runs without worrying about connection strings or data paths.
 
 - Option 1: If you have structured data, create a TabularDataset and use it directly in your training script.
@@ -31,7 +31,7 @@ Azure Machine Learning datasets provide a seamless integration with Azure Machin
 
 To create and train with datasets, you need:
 
-* An Azure subscription. If you don’t have an Azure subscription, create a free account before you begin. Try the [free or paid version of Azure Machine Learning](https://aka.ms/AMLFree) today.
+* An Azure subscription. If you don't have an Azure subscription, create a free account before you begin. Try the [free or paid version of Azure Machine Learning](https://aka.ms/AMLFree) today.
 
 * An [Azure Machine Learning workspace](how-to-manage-workspace.md).
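Option 1 above revolves around `TabularDataset`, which a training script typically materializes as a pandas DataFrame (via `to_pandas_dataframe()`). The sketch below imitates that downstream pattern with plain pandas so it runs without the Azure ML SDK; the CSV contents and column names are invented for illustration:

```python
# Stand-in for the DataFrame a TabularDataset would hand your training
# script: structured rows with named columns. The data is invented.
import io
import pandas as pd

csv_text = "age,income,label\n25,40000,0\n38,62000,1\n52,81000,1\n"
df = pd.read_csv(io.StringIO(csv_text))

# Typical split into features and target inside the training script.
X = df[["age", "income"]]
y = df["label"]
```

From here the training script proceeds exactly as it would with any local DataFrame, which is the point of Option 1: no connection strings or data paths appear in the code.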
 You can use a dataset as the input and output of each Machine Learning pipeline step. When you rerun pipelines, the output of each pipeline step is registered as a new dataset version.
 
-Because Machine Learning pipelines populate the output of each step into a new folder every time the pipeline reruns, the versioned output datasets are reproducible.
+Because Machine Learning pipelines populate the output of each step into a new folder every time the pipeline reruns, the versioned output datasets are reproducible. Learn more about [datasets in pipelines](how-to-create-your-first-pipeline.md#steps).
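The fresh-folder-per-rerun behavior described above is what makes versioned outputs reproducible: a rerun never overwrites an earlier version. The stdlib sketch below mimics that pattern locally; the folder layout, `run_NNN` naming, and metric values are invented for illustration, not the pipeline service's actual scheme:

```python
# Mimic versioned pipeline outputs: each rerun writes into its own new
# numbered folder, so earlier versions stay intact and readable.
import json
from pathlib import Path

def write_versioned_output(root: Path, data: dict) -> Path:
    """Write `data` into a fresh numbered subfolder under `root`."""
    version = (sum(1 for p in root.iterdir() if p.is_dir()) + 1) if root.exists() else 1
    out_dir = root / f"run_{version:03d}"
    out_dir.mkdir(parents=True)
    (out_dir / "output.json").write_text(json.dumps(data))
    return out_dir

root = Path("pipeline_outputs")
first = write_versioned_output(root, {"accuracy": 0.91})   # -> pipeline_outputs/run_001
second = write_versioned_output(root, {"accuracy": 0.93})  # -> pipeline_outputs/run_002
```

Because each rerun lands in its own folder, both versions remain available side by side, which is the property the versioned-dataset mechanism relies on.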
-You can also find the `input_datasets` from experiments by using [Azure Machine Learning studio](https://ml.azure.com/).
+You can also find the `input_datasets` from experiments by using https://ml.azure.com/.
 
 The following image shows where to find the input dataset of an experiment on Azure Machine Learning studio. For this example, go to your **Experiments** pane and open the **Properties** tab for a specific run of your experiment, `keras-mnist`.
@@ -180,7 +181,9 @@ model = run.register_model(model_name='keras-mlp-mnist',
 datasets=[('training data',train_dataset)])
 ```
 
-After registration, you can see the list of models registered with the dataset by using Python or [Azure Machine Learning studio](https://ml.azure.com/). The following view is from the **Datasets** pane under **Assets**. Select the dataset and then select the **Models** tab for a list of the models that are registered with the dataset.
+After registration, you can see the list of models registered with the dataset by using Python or go to https://ml.azure.com/.
+
+The following view is from the **Datasets** pane under **Assets**. Select the dataset and then select the **Models** tab for a list of the models that are registered with the dataset.