articles/machine-learning/data-science-virtual-machine/dsvm-pools.md
3 additions & 17 deletions

@@ -11,7 +11,6 @@ author: vijetajo
ms.author: vijetaj
ms.topic: conceptual
ms.date: 12/10/2018
---

# Create a shared pool of Data Science Virtual Machines
@@ -32,11 +31,13 @@ You can find a sample Azure Resource Manager template that creates a scale set w

You can create the scale set from the Azure Resource Manager template by specifying values for the parameter file in the Azure CLI:

```azurecli-interactive
az group create --name [[NAME OF RESOURCE GROUP]] --location [[LOCATION, for example "westus2"]]
az group deployment create --resource-group [[NAME OF RESOURCE GROUP ABOVE]] --template-uri https://raw.githubusercontent.com/Azure/DataScienceVM/master/Scripts/CreateDSVM/Ubuntu/dsvm-vmss-cluster.json --parameters @[[PARAMETER JSON FILE]]
```

The preceding commands assume you have:

* A copy of the parameter file with the values specified for your instance of the scale set.
* The number of VM instances.
* Pointers to the Azure Files share.
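
For reference, an Azure Resource Manager parameter file is a small JSON document. The sketch below is hypothetical: the parameter names (`instanceCount`, `adminUserName`, `fileShareName`) are placeholders for illustration, not the exact schema of the dsvm-vmss-cluster.json template — check the template's `parameters` section for the real names.

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "instanceCount": { "value": 2 },
    "adminUserName": { "value": "dsvmadmin" },
    "fileShareName": { "value": "dsvmshare" }
  }
}
```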
@@ -54,18 +55,3 @@ Virtual machine scale sets support autoscaling. You can set rules about when to

* [Set up a common Identity](dsvm-common-identity.md)
* [Securely store credentials to access cloud resources](dsvm-secure-access-keys.md)
articles/machine-learning/data-science-virtual-machine/dsvm-secure-access-keys.md
8 additions & 8 deletions

@@ -23,10 +23,9 @@ One way to secure credentials is to use Managed Service Identity (MSI) in combination w
The documentation about managed identities for Azure resources and Key Vault comprises a comprehensive resource for in-depth information on these services. The rest of this article walks through the basic use of MSI and Key Vault on the Data Science Virtual Machine (DSVM) to access Azure resources.

## Create a managed identity on the DSVM

```azurecli-interactive
# Prerequisite: You have already created a Data Science VM in the usual way.
# Create an identity principal for the VM.
@@ -35,9 +34,9 @@ az vm assign-identity -g <Resource Group Name> -n <Name of the VM>
az resource list -n <Name of the VM> --query [*].identity.principalId --out tsv
```
## Assign Key Vault access permissions to a VM principal

```azurecli-interactive
# Prerequisite: You have already created an empty Key Vault resource on Azure by using the Azure portal or Azure CLI.
# Assign only get and set permissions but not the capability to list the keys.
@@ -46,7 +45,7 @@ az keyvault set-policy --object-id <Principal ID of the DSVM from previous step>
# Prerequisite: You have granted your VMs MSI access to use storage account access keys based on instructions at https://docs.microsoft.com/azure/active-directory/managed-service-identity/tutorial-linux-vm-access-storage. This article describes the process in more detail.
# Now you can access the data in the storage account from the retrieved storage account keys.
```
## Access the key vault from Python
```python
@@ -97,7 +97,7 @@ print("My secret value is {}".format(secret.value))
## Access the key vault from Azure CLI

```azurecli-interactive
# With managed identities for Azure resources set up on the DSVM, users on the DSVM can use Azure CLI to perform the authorized functions. The following commands enable access to the key vault from Azure CLI without requiring login to an Azure account.
# Prerequisites: MSI is already set up on the DSVM as indicated earlier. Specific permissions, like accessing storage account keys, reading specific secrets, and writing new secrets, are provided to the MSI.
articles/machine-learning/how-to-configure-environment.md
15 additions & 20 deletions

@@ -27,7 +27,6 @@ The following table shows each development environment covered in this article,
|[Azure Databricks](#aml-databricks)| Ideal for running large-scale intensive machine learning workflows on the scalable Apache Spark platform. | Overkill for experimental machine learning, or smaller-scale experiments and workflows. Additional cost incurred for Azure Databricks. See [pricing details](https://azure.microsoft.com/pricing/details/databricks/). |
|[The Data Science Virtual Machine (DSVM)](#dsvm)| Similar to the cloud-based compute instance (Python and the SDK are pre-installed), but with additional popular data science and machine learning tools pre-installed. Easy to scale and combine with other custom tools and workflows. | A slower getting started experience compared to the cloud-based compute instance. |
This article also provides additional usage tips for the following tools:

* [Jupyter Notebooks](#jupyter): If you're already using the Jupyter Notebook, the SDK has some extras that you should install.

There is nothing to install or configure for a compute instance. Create one anytime from within your Azure Machine Learning workspace. Provide just a name and specify an Azure VM type. Try it now with this [Tutorial: Setup environment and workspace](tutorial-1st-experiment-sdk-setup.md).
Learn more about [compute instances](concept-compute-instance.md).
To stop incurring compute charges, [stop the compute instance](tutorial-1st-experiment-sdk-train.md#clean-up-resources).
@@ -95,7 +93,7 @@ To use the DSVM as a development environment:
* To create an Ubuntu Data Science Virtual Machine, use the following command:

```azurecli-interactive
# create an Ubuntu DSVM in your resource group
# note you need to be at least a contributor to the resource group in order to execute this command successfully
# If you need to create a new resource group use: "az group create --name YOUR-RESOURCE-GROUP-NAME --location YOUR-REGION (For example: westus2)"
@@ -104,7 +102,7 @@ To use the DSVM as a development environment:
* To create a Windows Data Science Virtual Machine, use the following command:

```azurecli-interactive
# create a Windows Server 2016 DSVM in your resource group
# note you need to be at least a contributor to the resource group in order to execute this command successfully
az vm create --resource-group YOUR-RESOURCE-GROUP-NAME --name YOUR-VM-NAME --image microsoft-dsvm:dsvm-windows:server-2016:latest --admin-username YOUR-USERNAME --admin-password YOUR-PASSWORD --authentication-type password
@@ -114,13 +112,13 @@ To use the DSVM as a development environment:
* For Ubuntu DSVM:

```bash
conda activate py36
```
* For Windows DSVM:

```cmd
conda activate AzureML
```
@@ -145,35 +143,35 @@ When you're using a local computer (which might also be a remote virtual machine
Run the following command to create the environment.

```cmd
conda create -n myenv python=3.6.5
```
Then activate the environment.

```cmd
conda activate myenv
```
This example creates an environment that uses Python 3.6.5, but you can choose any specific subversion. SDK compatibility is not guaranteed for every version (Python 3.5 or later is recommended); if you run into errors, try a different version or subversion in your Anaconda environment. Creating the environment takes several minutes while components and packages are downloaded.
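
Before installing the SDK into the environment, it can help to confirm which interpreter the environment actually resolved. A minimal stdlib sketch (the `(3, 5)` floor mirrors the recommendation above):

```python
import sys

# Recommended minimum from the guidance above: Python 3.5 or later.
MIN_VERSION = (3, 5)

def check_python(min_version=MIN_VERSION):
    """Return True if the running interpreter meets the minimum version."""
    return sys.version_info[:2] >= min_version

print(sys.version.split()[0], check_python())
```

Run this inside the activated environment so it reports the environment's interpreter rather than the base install.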
1. Run the following commands in your new environment to enable environment-specific IPython kernels. This will ensure expected kernel and package import behavior when working with Jupyter Notebooks within Anaconda environments:

```cmd
conda install notebook ipykernel
```
Then run the following command to create the kernel:
1. Use the following commands to install packages:
This command installs the base Azure Machine Learning SDK with notebook and `automl` extras. The `automl` extra is a large install, and can be removed from the brackets if you don't intend to run automated machine learning experiments. The `automl` extra also includes the Azure Machine Learning Data Prep SDK by default as a dependency.

```cmd
pip install azureml-sdk[notebooks,automl]
```
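
A quick way to confirm the install succeeded, without importing the whole SDK, is to check whether the package can be resolved at all. This stdlib sketch works for any top-level package name (`azureml` here is just the example target):

```python
import importlib.util

def is_importable(package_name):
    """Return True if `package_name` resolves to an importable module."""
    return importlib.util.find_spec(package_name) is not None

# After `pip install azureml-sdk[...]` the second value should be True on
# the machine where you installed it; stdlib modules such as `json` are
# always importable.
print(is_importable("json"), is_importable("azureml"))
```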
@@ -186,20 +184,19 @@ When you're using a local computer (which might also be a remote virtual machine
It will take several minutes to install the SDK. For more information on installation options, see the [install guide](https://docs.microsoft.com/python/api/overview/azure/ml/install?view=azure-ml-py).
1. Install other packages for your machine learning experimentation.
Use either of the following commands and replace *\<new package>* with the package you want to install. Installing packages via `conda install` requires that the package is part of the current channels (new channels can be added in Anaconda Cloud).

```cmd
conda install <new package>
```
Alternatively, you can install packages via `pip`.

```cmd
pip install <new package>
```
@@ -213,19 +210,19 @@ To enable these components in your Jupyter Notebook environment:
1. Open an Anaconda prompt and activate your environment.

```cmd
conda activate myenv
```
1. Clone [the GitHub repository](https://aka.ms/aml-notebooks) for a set of sample notebooks.
1. Launch the Jupyter Notebook server with the following command:

```cmd
jupyter notebook
```
@@ -245,7 +242,6 @@ To enable these components in your Jupyter Notebook environment:
1. To configure the Jupyter Notebook to use your Azure Machine Learning workspace, go to the [Create a workspace configuration file](#workspace) section.
### <a id="vscode"></a>Visual Studio Code
Visual Studio Code is a popular cross-platform code editor that supports an extensive set of programming languages and tools through extensions available in the [Visual Studio marketplace](https://marketplace.visualstudio.com/vscode). The [Azure Machine Learning extension](https://marketplace.visualstudio.com/items?itemName=ms-toolsai.vscode-ai) installs the [Python extension](https://marketplace.visualstudio.com/items?itemName=ms-python.python) for coding in all types of Python environments (virtual, Anaconda, and so on). In addition, it provides convenience features for working with Azure Machine Learning resources and running Azure Machine Learning experiments, all without leaving Visual Studio Code.
@@ -328,7 +324,7 @@ Once the cluster is running, [create a library](https://docs.databricks.com/user
+ In AutoML config, when using Azure Databricks, add the following parameters:
1. ```max_concurrent_iterations``` is based on the number of worker nodes in your cluster.
2. ```spark_context=sc``` is based on the default spark context.
+ Or, if you have an old SDK version, deselect it from the cluster's installed libraries and move it to trash. Install the new SDK version and restart the cluster. If there is an issue after the restart, detach and reattach your cluster.

If the install was successful, the imported library should look like one of these:
@@ -392,7 +388,6 @@ You can create the configuration file in three ways:
This code writes the configuration file to the *.azureml/config.json* file.
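
The generated file itself is small; it holds the three values that identify the workspace. The following shape matches the documented *config.json* format (the GUID and names below are placeholders):

```json
{
    "subscription_id": "00000000-0000-0000-0000-000000000000",
    "resource_group": "myresourcegroup",
    "workspace_name": "myworkspace"
}
```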
## Next steps
- [Train a model](tutorial-train-models-with-aml.md) on Azure Machine Learning with the MNIST dataset
This article provides an introduction to field-programmable gate arrays (FPGA), and shows you how to deploy your models to an Azure FPGA by using Azure Machine Learning.

FPGAs contain an array of programmable logic blocks, and a hierarchy of reconfigurable interconnects. The interconnects allow these blocks to be configured in various ways after manufacturing. Compared to other chips, FPGAs provide a combination of programmability and performance.
@@ -48,7 +48,7 @@ FPGAs on Azure supports:
+ Image classification and recognition scenarios
+ TensorFlow deployment
+ Intel FPGA hardware
These DNN models are currently available:
- ResNet 50
@@ -77,20 +77,17 @@ The following scenarios use FPGAs:
You can deploy a model as a web service on FPGAs with Azure Machine Learning Hardware Accelerated Models. Using FPGAs provides ultra-low-latency inference, even with a batch size of one. Inference, or model scoring, is the phase where the deployed model is used for prediction, most commonly on production data.
### Prerequisites
- An Azure subscription. If you do not have one, create a free account before you begin. Try the [free or paid version of Azure Machine Learning](https://aka.ms/AMLFree) today.
- FPGA quota. Use the Azure CLI to check whether you have quota:

```azurecli-interactive
az vm list-usage --location "eastus" -o table --query "[?localName=='Standard PBS Family vCPUs']"
```
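
The command returns the current usage and the limit for the PB family; the quota you have left is simply the difference. A small stdlib sketch of that arithmetic, assuming the `currentValue`/`limit` fields that `az vm list-usage` emits in its JSON output (the values below are made up for illustration):

```python
import json

# Example payload in the shape returned by `az vm list-usage -o json`.
usage_json = """
[{"currentValue": "6", "limit": "24",
  "localName": "Standard PBS Family vCPUs"}]
"""

def remaining_quota(payload, local_name):
    """Return limit - currentValue for the named vCPU family."""
    for entry in json.loads(payload):
        if entry["localName"] == local_name:
            return int(entry["limit"]) - int(entry["currentValue"])
    return None  # family not present in this region

print(remaining_quota(usage_json, "Standard PBS Family vCPUs"))  # → 18
```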
@@ -113,7 +110,7 @@ You can deploy a model as a web service on FPGAs with Azure Machine Learning Har
- The Python SDK for hardware-accelerated models:

```cmd
pip install --upgrade azureml-accel-models
```
@@ -321,7 +318,7 @@ To deploy your model as a high-scale production web service, use Azure Kubernete
```python
from azureml.core.compute import AksCompute, ComputeTarget

# Specify the Standard_PB6s Azure VM and location. Values for location may be "eastus", "southeastasia", "westeurope", or "westus2". If no value is specified, the default is "eastus".
```

articles/machine-learning/how-to-manage-workspace-cli.md
3 additions & 3 deletions

@@ -32,7 +32,7 @@ In this article, you learn how to create an Azure Machine Learning workspace usi
There are several ways that you can authenticate to your Azure subscription from the CLI. The most basic is to interactively authenticate using a browser. To authenticate interactively, open a command line or terminal and use the following command:

```azurecli-interactive
az login
```
@@ -143,13 +143,13 @@ To create a workspace that uses existing resources, you must provide the ID for
1. Install the application insights extension:

```azurecli-interactive
az extension add -n application-insights
```
2. Get the ID of your Application Insights service:

```azurecli-interactive
az monitor app-insights component show --app <application-insight-name> -g <resource-group-name> --query "id"