
Commit d4765c6

Merge pull request #107036 from msebolt/azurecli-update-pr2: azurecli updates

2 parents 8fd1f80 + 08e70f5

13 files changed (+92, -123 lines)

articles/machine-learning/data-science-virtual-machine/dsvm-pools.md

Lines changed: 3 additions & 17 deletions

@@ -11,7 +11,6 @@ author: vijetajo
 ms.author: vijetaj
 ms.topic: conceptual
 ms.date: 12/10/2018
-
 ---

 # Create a shared pool of Data Science Virtual Machines
@@ -32,11 +31,13 @@ You can find a sample Azure Resource Manager template that creates a scale set w

 You can create the scale set from the Azure Resource Manager template by specifying values for the parameter file in the Azure CLI:

-```
+```azurecli-interactive
 az group create --name [[NAME OF RESOURCE GROUP]] --location [[ Data center. For eg: "West US 2"]
 az group deployment create --resource-group [[NAME OF RESOURCE GROUP ABOVE]] --template-uri https://raw.githubusercontent.com/Azure/DataScienceVM/master/Scripts/CreateDSVM/Ubuntu/dsvm-vmss-cluster.json --parameters @[[PARAMETER JSON FILE]]
 ```
+
 The preceding commands assume you have:
+
 * A copy of the parameter file with the values specified for your instance of the scale set.
 * The number of VM instances.
 * Pointers to the Azure Files share.
@@ -54,18 +55,3 @@ Virtual machine scale sets support autoscaling. You can set rules about when to

 * [Set up a common Identity](dsvm-common-identity.md)
 * [Securely store credentials to access cloud resources](dsvm-secure-access-keys.md)
(15 trailing blank lines removed)

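As an aside to the hunk above: the deployment command reads a `[[PARAMETER JSON FILE]]`, which this diff leaves unspecified. The sketch below builds a file of the standard ARM deployment-parameters shape. The parameter names (`instanceCount`, `fileShareName`) are illustrative assumptions; the real names are defined in `dsvm-vmss-cluster.json`.

```python
import json

# Hypothetical sketch of the "[[PARAMETER JSON FILE]]" that
# `az group deployment create ... --parameters @<file>` expects.
# Parameter names here are illustrative, not taken from the template.
def build_parameters(**params):
    # ARM deployment parameter files wrap each value in {"value": ...}.
    return {
        "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
        "contentVersion": "1.0.0.0",
        "parameters": {k: {"value": v} for k, v in params.items()},
    }

doc = build_parameters(instanceCount=2, fileShareName="dsvmfiles")
print(json.dumps(doc["parameters"]["instanceCount"]))  # {"value": 2}
```

Serialize `doc` with `json.dump` to a file and pass that path after `@` in the deployment command.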
articles/machine-learning/data-science-virtual-machine/dsvm-secure-access-keys.md

Lines changed: 8 additions & 8 deletions

@@ -23,10 +23,9 @@ One way to secure credentials is to use Windows Installer (MSI) in combination w

 The documentation about managed identities for Azure resources and Key Vault comprises a comprehensive resource for in-depth information on these services. The rest of this article walks through the basic use of MSI and Key Vault on the Data Science Virtual Machine (DSVM) to access Azure resources.

-## Create a managed identity on the DSVM
+## Create a managed identity on the DSVM

-
-```
+```azurecli-interactive
 # Prerequisite: You have already created a Data Science VM in the usual way.

 # Create an identity principal for the VM.
@@ -35,9 +34,9 @@ az vm assign-identity -g <Resource Group Name> -n <Name of the VM>
 az resource list -n <Name of the VM> --query [*].identity.principalId --out tsv
 ```

-
 ## Assign Key Vault access permissions to a VM principal
-```
+
+```azurecli-interactive
 # Prerequisite: You have already created an empty Key Vault resource on Azure by using the Azure portal or Azure CLI.

 # Assign only get and set permissions but not the capability to list the keys.
@@ -46,7 +45,7 @@ az keyvault set-policy --object-id <Principal ID of the DSVM from previous step>

 ## Access a secret in the key vault from the DSVM

-```
+```bash
 # Get the access token for the VM.
 x=`curl http://localhost:50342/oauth2/token --data "resource=https://vault.azure.net" -H Metadata:true`
 token=`echo $x | python -c "import sys, json; print(json.load(sys.stdin)['access_token'])"`
@@ -57,7 +56,7 @@ curl https://<Vault Name>.vault.azure.net/secrets/SQLPasswd?api-version=2016-10-

 ## Access storage keys from the DSVM

-```
+```bash
 # Prerequisite: You have granted your VMs MSI access to use storage account access keys based on instructions at https://docs.microsoft.com/azure/active-directory/managed-service-identity/tutorial-linux-vm-access-storage. This article describes the process in more detail.

 y=`curl http://localhost:50342/oauth2/token --data "resource=https://management.azure.com/" -H Metadata:true`
@@ -66,6 +65,7 @@ curl https://management.azure.com/subscriptions/<SubscriptionID>/resourceGroups/

 # Now you can access the data in the storage account from the retrieved storage account keys.
 ```
+
 ## Access the key vault from Python

 ```python
@@ -97,7 +97,7 @@ print("My secret value is {}".format(secret.value))

 ## Access the key vault from Azure CLI

-```
+```azurecli-interactive
 # With managed identities for Azure resources set up on the DSVM, users on the DSVM can use Azure CLI to perform the authorized functions. The following commands enable access to the key vault from Azure CLI without requiring login to an Azure account.
 # Prerequisites: MSI is already set up on the DSVM as indicated earlier. Specific permissions, like accessing storage account keys, reading specific secrets, and writing new secrets, are provided to the MSI.

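The bash snippets in this file extract the MSI access token with an inline `python -c` one-liner. The same parsing step written out as a standalone function (a sketch; the real response body comes from `curl http://localhost:50342/oauth2/token` on the DSVM, so the sample string here is fake):

```python
import json

# Standalone version of the inline `python -c` one-liner used in the
# diff above to pull the access token out of the local MSI endpoint's
# JSON response body.
def extract_access_token(response_text):
    """Return the bearer token from an OAuth2 token response body."""
    return json.loads(response_text)["access_token"]

# Fake response body for illustration only:
sample = '{"access_token": "eyJ0eXAi...", "token_type": "Bearer"}'
print(extract_access_token(sample))  # eyJ0eXAi...
```

The returned token is what the later `curl` calls pass in the `Authorization: Bearer` header.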
articles/machine-learning/how-to-configure-environment.md

Lines changed: 15 additions & 20 deletions

@@ -27,7 +27,6 @@ The following table shows each development environment covered in this article,
 | [Azure Databricks](#aml-databricks) | Ideal for running large-scale intensive machine learning workflows on the scalable Apache Spark platform. | Overkill for experimental machine learning, or smaller-scale experiments and workflows. Additional cost incurred for Azure Databricks. See [pricing details](https://azure.microsoft.com/pricing/details/databricks/). |
 | [The Data Science Virtual Machine (DSVM)](#dsvm) | Similar to the cloud-based compute instance (Python and the SDK are pre-installed), but with additional popular data science and machine learning tools pre-installed. Easy to scale and combine with other custom tools and workflows. | A slower getting started experience compared to the cloud-based compute instance. |

-
 This article also provides additional usage tips for the following tools:

 * [Jupyter Notebooks](#jupyter): If you're already using the Jupyter Notebook, the SDK has some extras that you should install.
@@ -55,7 +54,6 @@ The Azure Machine Learning [compute instance (preview)](concept-compute-instance

 There is nothing to install or configure for a compute instance. Create one anytime from within your Azure Machine Learning workspace. Provide just a name and specify an Azure VM type. Try it now with this [Tutorial: Setup environment and workspace](tutorial-1st-experiment-sdk-setup.md).

-
 Learn more about [compute instances](concept-compute-instance.md).

 To stop incurring compute charges, [stop the compute instance](tutorial-1st-experiment-sdk-train.md#clean-up-resources).
@@ -91,7 +89,7 @@ To use the DSVM as a development environment:

 * To create an Ubuntu Data Science Virtual Machine, use the following command:

-```azurecli
+```azurecli-interactive
 # create a Ubuntu DSVM in your resource group
 # note you need to be at least a contributor to the resource group in order to execute this command successfully
 # If you need to create a new resource group use: "az group create --name YOUR-RESOURCE-GROUP-NAME --location YOUR-REGION (For example: westus2)"
@@ -100,7 +98,7 @@ To use the DSVM as a development environment:

 * To create a Windows Data Science Virtual Machine, use the following command:

-```azurecli
+```azurecli-interactive
 # create a Windows Server 2016 DSVM in your resource group
 # note you need to be at least a contributor to the resource group in order to execute this command successfully
 az vm create --resource-group YOUR-RESOURCE-GROUP-NAME --name YOUR-VM-NAME --image microsoft-dsvm:dsvm-windows:server-2016:latest --admin-username YOUR-USERNAME --admin-password YOUR-PASSWORD --authentication-type password
@@ -110,13 +108,13 @@ To use the DSVM as a development environment:

 * For Ubuntu DSVM:

-```shell
+```bash
 conda activate py36
 ```

 * For Windows DSVM:

-```shell
+```bash
 conda activate AzureML
 ```
@@ -141,35 +139,35 @@ When you're using a local computer (which might also be a remote virtual machine

 Run the following command to create the environment.

-```shell
+```bash
 conda create -n myenv python=3.6.5
 ```

 Then activate the environment.

-```shell
+```bash
 conda activate myenv
 ```

 This example creates an environment using python 3.6.5, but any specific subversions can be chosen. SDK compatibility may not be guaranteed with certain major versions (3.5+ is recommended), and it's recommended to try a different version/subversion in your Anaconda environment if you run into errors. It will take several minutes to create the environment while components and packages are downloaded.

 1. Run the following commands in your new environment to enable environment-specific IPython kernels. This will ensure expected kernel and package import behavior when working with Jupyter Notebooks within Anaconda environments:

-```shell
+```bash
 conda install notebook ipykernel
 ```

 Then run the following command to create the kernel:

-```shell
+```bash
 ipython kernel install --user --name myenv --display-name "Python (myenv)"
 ```

 1. Use the following commands to install packages:

 This command installs the base Azure Machine Learning SDK with notebook and `automl` extras. The `automl` extra is a large install, and can be removed from the brackets if you don't intend to run automated machine learning experiments. The `automl` extra also includes the Azure Machine Learning Data Prep SDK by default as a dependency.

-```shell
+```bash
 pip install azureml-sdk[notebooks,automl]
 ```
@@ -182,20 +180,19 @@ When you're using a local computer (which might also be a remote virtual machine
 >
 > `pip install --upgrade azureml-sdk\[notebooks,automl\]`

-
 It will take several minutes to install the SDK. For more information on installation options, see the [install guide](https://docs.microsoft.com/python/api/overview/azure/ml/install?view=azure-ml-py).

 1. Install other packages for your machine learning experimentation.

 Use either of the following commands and replace *\<new package>* with the package you want to install. Installing packages via `conda install` requires that the package is part of the current channels (new channels can be added in Anaconda Cloud).

-```shell
+```bash
 conda install <new package>
 ```

 Alternatively, you can install packages via `pip`.

-```shell
+```bash
 pip install <new package>
 ```
@@ -209,19 +206,19 @@ To enable these components in your Jupyter Notebook environment:

 1. Open an Anaconda prompt and activate your environment.

-```shell
+```bash
 conda activate myenv
 ```

 1. Clone [the GitHub repository](https://aka.ms/aml-notebooks) for a set of sample notebooks.

-```CLI
+```bash
 git clone https://github.com/Azure/MachineLearningNotebooks.git
 ```

 1. Launch the Jupyter Notebook server with the following command:

-```shell
+```bash
 jupyter notebook
 ```
@@ -241,7 +238,6 @@ To enable these components in your Jupyter Notebook environment:

 1. To configure the Jupyter Notebook to use your Azure Machine Learning workspace, go to the [Create a workspace configuration file](#workspace) section.

-
 ### <a id="vscode"></a>Visual Studio Code

 Visual Studio Code is a very popular cross platform code editor that supports an extensive set of programming languages and tools through extensions available in the [Visual Studio marketplace](https://marketplace.visualstudio.com/vscode). The [Azure Machine Learning extension](https://marketplace.visualstudio.com/items?itemName=ms-toolsai.vscode-ai) installs the [Python extension](https://marketplace.visualstudio.com/items?itemName=ms-python.python) for coding in all types of Python environments (virtual, Anaconda, etc.). In addition, it provides convenience features for working with Azure Machine Learning resources and running Azure Machine Learning experiments all without leaving Visual Studio Code.
@@ -324,7 +320,7 @@ Once the cluster is running, [create a library](https://docs.databricks.com/user
 + In AutoML config, when using Azure Databricks add the following parameters:
 1. ```max_concurrent_iterations``` is based on number of worker nodes in your cluster.
 2. ```spark_context=sc``` is based on the default spark context.
-+ Or, if you have an old SDK version, deselect it from clusters installed libs and move to trash. Install the new SDK version and restart the cluster. If there is an issue after the restart, detach and reattach your cluster.
++ Or, if you have an old SDK version, deselect it from cluster's installed libs and move to trash. Install the new SDK version and restart the cluster. If there is an issue after the restart, detach and reattach your cluster.

 If install was successful, the imported library should look like one of these:
@@ -388,7 +384,6 @@ You can create the configuration file in three ways:

 This code writes the configuration file to the *.azureml/config.json* file.

-
 ## Next steps

 - [Train a model](tutorial-train-models-with-aml.md) on Azure Machine Learning with the MNIST dataset

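The environment-setup text in the file above recommends Python 3.5+ for the SDK and notes that the wrong interpreter version surfaces as confusing install errors. A small guard like the following (an illustrative sketch, not part of the SDK) fails fast with a clear message instead:

```python
import sys

# Illustrative guard for the "3.5+ is recommended" note in the article;
# run it before `pip install azureml-sdk[notebooks,automl]`.
def check_python_version(minimum=(3, 5)):
    if sys.version_info[:2] < minimum:
        raise RuntimeError(
            "azureml-sdk recommends Python %d.%d+; found %d.%d"
            % (minimum + tuple(sys.version_info[:2]))
        )
    return True

check_python_version()
```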
articles/machine-learning/how-to-deploy-fpga-web-service.md

Lines changed: 5 additions & 8 deletions

@@ -17,7 +17,7 @@ ms.custom: seodec18
 # What are field-programmable gate arrays (FPGA) and how to deploy
 [!INCLUDE [applies-to-skus](../../includes/aml-applies-to-basic-enterprise-sku.md)]

-This article provides an introduction to field-programmable gate arrays (FPGA), and shows you how to deploy your models using Azure Machine Learning to an Azure FPGA.
+This article provides an introduction to field-programmable gate arrays (FPGA), and shows you how to deploy your models using Azure Machine Learning to an Azure FPGA.

 FPGAs contain an array of programmable logic blocks, and a hierarchy of reconfigurable interconnects. The interconnects allow these blocks to be configured in various ways after manufacturing. Compared to other chips, FPGAs provide a combination of programmability and performance.

@@ -48,7 +48,7 @@ FPGAs on Azure supports:

 + Image classification and recognition scenarios
 + TensorFlow deployment
-+ Intel FPGA hardware
++ Intel FPGA hardware

 These DNN models are currently available:
 - ResNet 50
@@ -77,20 +77,17 @@ The following scenarios use FPGAs:

 + [Land cover mapping](https://blogs.technet.microsoft.com/machinelearning/2018/05/29/how-to-use-fpgas-for-deep-learning-inference-to-perform-land-cover-mapping-on-terabytes-of-aerial-images/)

-
-
-## Example: Deploy models on FPGAs
+## Example: Deploy models on FPGAs

 You can deploy a model as a web service on FPGAs with Azure Machine Learning Hardware Accelerated Models. Using FPGAs provides ultra-low latency inference, even with a single batch size. Inference, or model scoring, is the phase where the deployed model is used for prediction, most commonly on production data.

-
 ### Prerequisites

 - An Azure subscription. If you do not have one, create a free account before you begin. Try the [free or paid version of Azure Machine Learning](https://aka.ms/AMLFree) today.

 - FPGA quota. Use the Azure CLI to check whether you have quota:

-```shell
+```azurecli-interactive
 az vm list-usage --location "eastus" -o table --query "[?localName=='Standard PBS Family vCPUs']"
 ```
@@ -113,7 +110,7 @@ You can deploy a model as a web service on FPGAs with Azure Machine Learning Har

 - The Python SDK for hardware-accelerated models:

-```shell
+```bash
 pip install --upgrade azureml-accel-models
 ```

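The quota-check command in the file above filters `az vm list-usage` output by `localName`. When scripting that check, the JSON output (`-o json` instead of `-o table`) can be parsed like this sketch; the field names mirror what the article's `--query` assumes, and the sample record is fabricated for illustration:

```python
import json

# Hedged helper: compute remaining FPGA quota from `az vm list-usage`
# JSON output, filtering for the family the article queries.
def remaining_quota(usage_json, family="Standard PBS Family vCPUs"):
    for item in json.loads(usage_json):
        if item["localName"] == family:
            return int(item["limit"]) - int(item["currentValue"])
    return None  # family absent: no quota granted in this region

# Fabricated sample record for illustration:
sample = json.dumps([
    {"localName": "Standard PBS Family vCPUs", "currentValue": 6, "limit": 24}
])
print(remaining_quota(sample))  # 18
```

A `None` result means you still need to request FPGA quota before deploying.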
articles/machine-learning/how-to-manage-workspace-cli.md

Lines changed: 3 additions & 3 deletions

@@ -32,7 +32,7 @@ In this article, you learn how to create an Azure Machine Learning workspace usi

 There are several ways that you can authenticate to your Azure subscription from the CLI. The most basic is to interactively authenticate using a browser. To authenticate interactively, open a command line or terminal and use the following command:

-```azurecli
+```azurecli-interactive
 az login
 ```

@@ -146,13 +146,13 @@ To create a workspace that uses existing resources, you must provide the ID for

 1. Install the application insights extension:

-```bash
+```azurecli-interactive
 az extension add -n application-insights
 ```

 2. Get the ID of your application insight service:

-```bash
+```azurecli-interactive
 az monitor app-insights component show --app <application-insight-name> -g <resource-group-name> --query "id"
 ```

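The `--query "id"` call in the file above returns a full ARM resource ID. When scripting, the same shape can be assembled locally, as in this sketch (the provider path `microsoft.insights/components` is the Application Insights resource type; the subscription, group, and component names below are placeholders):

```python
# Illustrative helper: assemble the ARM resource ID that
# `az monitor app-insights component show ... --query "id"` would return.
def app_insights_id(subscription, resource_group, name):
    return ("/subscriptions/{sub}/resourceGroups/{rg}"
            "/providers/microsoft.insights/components/{name}").format(
        sub=subscription, rg=resource_group, name=name)

rid = app_insights_id("0000-1111", "my-rg", "my-appinsights")
print(rid)
```

The resulting ID is what you pass to commands that accept an existing Application Insights resource.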
articles/machine-learning/resource-known-issues.md

Lines changed: 8 additions & 8 deletions

@@ -123,7 +123,7 @@ You will not be able to deploy models on FPGAs until you have requested and been
 ## Automated machine learning

 Tensor Flow
-Automated machine learning does not currently support tensor flow version 1.13. Installing this version will cause package dependencies to stop working. We are working to fix this issue in a future release.
+Automated machine learning does not currently support tensor flow version 1.13. Installing this version will cause package dependencies to stop working. We are working to fix this issue in a future release.

 ### Experiment Charts

@@ -145,7 +145,7 @@ script_params = {
 ```

 If you don't include the leading forward slash, '/', you'll need to prefix the working directory e.g.
-`/mnt/batch/.../tmp/dataset` on the compute target to indicate where you want the dataset to be mounted.
+`/mnt/batch/.../tmp/dataset` on the compute target to indicate where you want the dataset to be mounted.

 ### Fail to read Parquet file from HTTP or ADLS Gen 2

@@ -205,14 +205,14 @@ If you see this error when you use automated machine learning, run the two follo

 If you see this error when you use automated machine learning:

-1. Run this command to install two packages in your Azure Databricks cluster:
+1. Run this command to install two packages in your Azure Databricks cluster:

-```
+```bash
 scikit-learn==0.19.1
 pandas==0.22.0
 ```

-1. Detach and then reattach the cluster to your notebook.
+1. Detach and then reattach the cluster to your notebook.

 If these steps don't solve the issue, try restarting the cluster.

@@ -265,11 +265,11 @@ If you receive an error `Unable to upload project files to working directory in

 If you are using file share for other workloads, such as data transfer, the recommendation is to use blobs so that file share is free to be used for submitting runs. You may also split the workload between two different workspaces.

-## Webservices in Azure Kubernetes Service failures
+## Webservices in Azure Kubernetes Service failures

 Many webservice failures in Azure Kubernetes Service can be debugged by connecting to the cluster using `kubectl`. You can get the `kubeconfig.json` for an Azure Kubernetes Service Cluster by running

-```bash
+```azurecli-interactive
 az aks get-credentials -g <rg> -n <aks cluster name>
 ```

@@ -313,7 +313,7 @@ Known issues with labeling projects.

 ### Only datasets created on blob datastores can be used

-This is a known limitation of the current release.
+This is a known limitation of the current release.

 ### After creation, the project shows "Initializing" for a long time

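The leading-slash known issue in the file above boils down to one rule: a dataset mount path must either start with `/` or be prefixed with the compute target's working directory. A sketch of that rule (the working-directory value is a placeholder; the real one is the truncated `/mnt/batch/...` path on the compute target):

```python
# Sketch of the path rule from the known issue: absolute paths pass
# through; relative paths are joined onto the compute target's
# working directory.
def resolve_mount_path(path, working_dir):
    if path.startswith("/"):
        return path  # already absolute on the compute target
    return working_dir.rstrip("/") + "/" + path

print(resolve_mount_path("tmp/dataset", "/mnt/batch/tasks/wd"))
# /mnt/batch/tasks/wd/tmp/dataset
```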