Commit 3702e22

Merge branch 'master' of github.com:MicrosoftDocs/azure-docs-pr

2 parents 848001a + c991fd3

6 files changed: +22 additions, −8 deletions

articles/backup/backup-azure-vms-automation.md

Lines changed: 2 additions & 2 deletions

`````diff
@@ -316,9 +316,9 @@ Set-AzRecoveryServicesBackupProtectionPolicy -Policy $pol -RetentionPolicy $Ret
 > From Az PS version 1.6.0 onwards, one can update the instant restore snapshot retention period in policy using Powershell
 
 ````powershell
-$bkpPol = Get-AzureRmRecoveryServicesBackupProtectionPolicy -WorkloadType "AzureVM" -VaultId $targetVault.ID
+$bkpPol = Get-AzRecoveryServicesBackupProtectionPolicy -WorkloadType "AzureVM" -VaultId $targetVault.ID
 $bkpPol.SnapshotRetentionInDays=7
-Set-AzureRmRecoveryServicesBackupProtectionPolicy -policy $bkpPol -VaultId $targetVault.ID
+Set-AzRecoveryServicesBackupProtectionPolicy -policy $bkpPol -VaultId $targetVault.ID
 ````
 
 The default value will be 2, user can set the value with a min of 1 and max of 5. For weekly backup policies, the period is set to 5 and cannot be changed.
`````
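The retention rule quoted in the last context line of this hunk (default of 2, settable between 1 and 5 for daily policies, fixed at 5 for weekly policies) can be sketched as a small validation check. This is an illustrative sketch only; the function name is hypothetical and not part of the Az module.

```python
# Hypothetical helper illustrating the documented SnapshotRetentionInDays rule:
# daily policies accept 1-5 days (default 2); weekly policies are fixed at 5.
DEFAULT_RETENTION_DAYS = 2

def is_valid_snapshot_retention(days, schedule="daily"):
    if schedule == "weekly":
        return days == 5  # cannot be changed for weekly backup policies
    return 1 <= days <= 5

print(is_valid_snapshot_retention(7))  # prints False: out of range for daily
```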

articles/billing/billing-understand-reserved-instance-usage-ea.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -58,14 +58,14 @@ Other information available in Azure usage data has changed:
 
 You can get the data using the API or download it from Azure portal.
 
-You call the [Usage Details API](/rest/api/consumption/usagedetails/list) with API version "2019-04-01-preview" to get the new data. For details about terminology, see [usage terms](billing-understand-your-usage.md). The caller should be an Enterprise Administrator for the enterprise agreement using the [EA portal](https://ea.azure.com). Read-only Enterprise Administrators can also get the data.
+You call the [Usage Details API](/rest/api/consumption/usagedetails/list) to get the new data. For details about terminology, see [usage terms](billing-understand-your-usage.md). The caller should be an Enterprise Administrator for the enterprise agreement using the [EA portal](https://ea.azure.com). Read-only Enterprise Administrators can also get the data.
 
 The data is not available in [Reporting APIs for Enterprise customers - Usage Details](/rest/api/billing/enterprise/billing-enterprise-api-usage-detail).
 
 Here's an example call to the API:
 
 ```
-https://management.azure.com/providers/Microsoft.Billing/billingAccounts/{enrollmentId}/providers/Microsoft.Billing/billingPeriods/{billingPeriodId}/providers/Microsoft.Consumption/usagedetails?metric={metric}&api-version=2019-04-01-preview&$filter={filter}
+https://management.azure.com/providers/Microsoft.Billing/billingAccounts/{enrollmentId}/providers/Microsoft.Billing/billingPeriods/{billingPeriodId}/providers/Microsoft.Consumption/usagedetails?metric={metric}&api-version=2019-05-01&$filter={filter}
 ```
 
 For more information about {enrollmentId} and {billingPeriodId}, see the [Usage Details – List](https://docs.microsoft.com/rest/api/consumption/usagedetails/list) API article.
````
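As a sketch of how the example URL in this hunk is assembled from its placeholders, the helper below only builds the request URL with the updated `api-version=2019-05-01`; it does not authenticate or call the API. The function name and the sample IDs are hypothetical placeholders, not values from the commit.

```python
# Build the Usage Details request URL shown in the diff above. The enrollment
# and billing-period IDs passed in are hypothetical placeholders; a real call
# also needs an Azure AD bearer token in the Authorization header.
def usage_details_url(enrollment_id, billing_period_id, metric, flt=None,
                      api_version="2019-05-01"):
    url = (
        "https://management.azure.com/providers/Microsoft.Billing"
        f"/billingAccounts/{enrollment_id}"
        f"/providers/Microsoft.Billing/billingPeriods/{billing_period_id}"
        "/providers/Microsoft.Consumption/usagedetails"
        f"?metric={metric}&api-version={api_version}"
    )
    if flt:
        url += f"&$filter={flt}"  # the $filter query parameter is optional
    return url

print(usage_details_url("12345678", "201907", "actualcost"))
```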

articles/event-hubs/event-hubs-java-get-started-send.md

Lines changed: 2 additions & 2 deletions

```diff
@@ -178,11 +178,11 @@ To use EventProcessorHost, you must have an [Azure Storage account][Azure Storag
 1. Sign in the [Azure portal](https://portal.azure.com), and select **Create a resource** on the left-hand side of the screen.
 2. Select **Storage**, then select **Storage account**. In the **Create storage account** window, type a name for the storage account. Complete the rest of the fields, select your desired region, and then select **Create**.
 
-   ![Create storage account](./media/event-hubs-dotnet-framework-getstarted-receive-eph/create-storage2.png)
+   ![Create a storage account in Azure portal](./media/event-hubs-dotnet-framework-getstarted-receive-eph/create-azure-storage-account.png)
 
 3. Select the newly created storage account, and then select **Access Keys**:
 
-   ![Get access keys](./media/event-hubs-dotnet-framework-getstarted-receive-eph/create-storage3.png)
+   ![Get your access keys in Azure portal](./media/event-hubs-dotnet-framework-getstarted-receive-eph/select-azure-storage-access-keys.png)
 
    Copy the key1 value to a temporary location. You use it later in this tutorial.
 
```
188188

articles/machine-learning/service/how-to-use-mlflow.md

Lines changed: 16 additions & 2 deletions

````diff
@@ -9,7 +9,7 @@ ms.service: machine-learning
 ms.subservice: core
 ms.reviewer: nibaccam
 ms.topic: conceptual
-ms.date: 08/07/2019
+ms.date: 09/23/2019
 ms.custom: seodec18
 ---
 
@@ -141,6 +141,7 @@ MLflow Tracking with Azure Machine Learning lets you store the logged metrics an
 To run your Mlflow experiments with Azure Databricks, you need to first create an [Azure Databricks workspace and cluster](https://docs.microsoft.com/azure/azure-databricks/quickstart-create-databricks-workspace-portal)
 
 In your cluster, be sure to install the *azureml-mlflow* library from PyPi, to ensure that your cluster has access to the necessary functions and classes.
+From here, import your experiment notebook, attach your cluster to it and run your experiment.
 
 ### Install libraries
 
@@ -179,11 +180,18 @@ workspace_name = 'workspace_name'
 ws = Workspace.get(name=workspace_name,
                    subscription_id=subscription_id,
                    resource_group=resource_group)
-
 ```
+
+#### Connect your Azure Databricks and Azure Machine Learning workspaces
+
+On the [Azure portal](https://ms.portal.azure.com), you can link your Azure Databricks (ADB) workspace to a new or existing Azure Machine Learning workspace. To do so, navigate to your ADB workspace and select the **Link Azure Machine Learning workspace** button on the bottom right. Linking your workspaces enables you to track your experiment data in the Azure Machine Learning workspace.
+
 ### Link MLflow tracking to your workspace
+
 After you instantiate your workspace, set the MLflow tracking URI. By doing so, you link the MLflow tracking to Azure Machine Learning workspace. After this, all your experiments will land in the managed Azure Machine Learning tracking service.
 
+#### Directly set MLflow Tracking in your notebook
+
 ```python
 uri = ws.get_mlflow_tracking_uri()
 mlflow.set_tracking_uri(uri)
@@ -196,6 +204,12 @@ import mlflow
 mlflow.log_metric('epoch_loss', loss.item())
 ```
 
+#### Automate setting MLflow Tracking
+
+Instead of manually setting the tracking URI in every subsequent experiment notebook session on your clusters, do so automatically using this [Azure Machine Learning Tracking Cluster Init script](https://github.com/Azure/MachineLearningNotebooks/blob/3ce779063b000e0670bdd1acc6bc3a4ee707ec13/how-to-use-azureml/azure-databricks/linking/README.md).
+
+When configured correctly, you are able to see your MLflow tracking data in Azure Machine Learning's REST API and all clients, and in Azure Databricks via the MLflow user interface or by using the MLflow client.
+
 ## View metrics and artifacts in your workspace
 
 The metrics and artifacts from MLflow logging are kept in your workspace. To view them anytime, navigate to your workspace and find the experiment by name on the [Azure portal](https://portal.azure.com) or in your [workspace landing page (preview)](https://ml.azure.com). Or run the below code.
````
