:::image type="content" source="./media/how-to-use-mlflow-azure-databricks/link-workspaces.png" lightbox="./media/how-to-use-mlflow-azure-databricks/link-workspaces.png" alt-text="Screenshot shows the Link Azure Databricks and Azure Machine Learning workspaces option.":::
After you link your Azure Databricks workspace with your Azure Machine Learning workspace, MLflow experiments are automatically tracked in the following places:
You can get the Azure Machine Learning MLflow tracking URI by using the [Azure Machine Learning SDK v2 for Python](concept-v2.md). Ensure that the `azure-ai-ml` library is installed on the compute you're using. The following sample gets the unique MLflow tracking URI associated with your workspace.
1. Sign in to your workspace by using the `MLClient`. The easiest way to do that is to use the workspace config file:
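    A minimal sketch of this step (it assumes a `config.json` workspace configuration file is present in the working directory or a parent directory, and that live Azure credentials are available):

    ```python
    from azure.ai.ml import MLClient
    from azure.identity import DefaultAzureCredential

    # Authenticate and load the workspace details from config.json
    ml_client = MLClient.from_config(credential=DefaultAzureCredential())

    # The workspace object exposes the MLflow tracking URI
    azureml_tracking_uri = ml_client.workspaces.get(
        ml_client.workspace_name
    ).mlflow_tracking_uri
    ```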
> [!IMPORTANT]
> `DefaultAzureCredential` tries to pull the credentials from the available context. If you want to specify credentials in a different way, for instance by using a web browser interactively, you can use `InteractiveBrowserCredential` or any other method available in the [`azure.identity`](https://pypi.org/project/azure-identity/) package.
Use the Azure Machine Learning portal to get the tracking URI:
1. Select **View all properties in Azure Portal**.
1. On the **Essentials** section, find the property **MLflow tracking URI**.
# [Manually](#tab/manual)
You can construct the Azure Machine Learning tracking URI by using the subscription ID, the region where the resource is deployed, the resource group name, and the workspace name. The following code sample shows how:
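As a sketch (all identifier values below are placeholders, not real resources), the URI follows this pattern:

```python
subscription_id = "00000000-0000-0000-0000-000000000000"  # placeholder
resource_group = "my-resource-group"                      # placeholder
workspace_name = "my-workspace"                           # placeholder
region = "eastus"                                         # placeholder

# Assemble the workspace's MLflow tracking URI from its identifiers
mlflow_tracking_uri = (
    f"azureml://{region}.api.azureml.ms/mlflow/v1.0"
    f"/subscriptions/{subscription_id}"
    f"/resourceGroups/{resource_group}"
    f"/providers/Microsoft.MachineLearningServices"
    f"/workspaces/{workspace_name}"
)
```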
Then the method [`set_tracking_uri()`](https://mlflow.org/docs/latest/python_api/mlflow.html#mlflow.set_tracking_uri) points the MLflow tracking URI to that URI.
```python
mlflow.set_tracking_uri(mlflow_tracking_uri)
```
# [Use environment variables](#tab/environ)
You can set the MLflow environment variable [MLFLOW_TRACKING_URI](https://mlflow.org/docs/latest/tracking.html#logging-to-a-tracking-server) in your compute so that any interaction with MLflow in that compute points to Azure Machine Learning by default.
```bash
MLFLOW_TRACKING_URI=$(az ml workspace show --query mlflow_tracking_uri | sed 's/"//g')
```
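Equivalently, you can set the variable from within a Python session before using MLflow (the URI shown is a placeholder for your workspace's actual value):

```python
import os

# Placeholder value; substitute your workspace's real MLflow tracking URI
os.environ["MLFLOW_TRACKING_URI"] = (
    "azureml://eastus.api.azureml.ms/mlflow/v1.0"
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.MachineLearningServices/workspaces/<workspace-name>"
)
```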
---
> [!TIP]
> When working with shared environments, like an Azure Databricks cluster, Azure Synapse Analytics cluster, or similar, you can set the environment variable `MLFLOW_TRACKING_URI` at the cluster level. This approach allows you to automatically configure the MLflow tracking URI to point to Azure Machine Learning for all sessions that run in the cluster, rather than doing it on a per-session basis.
>
> :::image type="content" source="./media/how-to-use-mlflow-azure-databricks/env.png" alt-text="Screenshot shows Advanced options where you can configure the environment variables in an Azure Databricks cluster.":::
>
> After you configure the environment variable, any experiment running in that cluster is tracked in Azure Machine Learning.
#### Configure authentication
After you configure tracking, configure how to authenticate to the associated workspace. By default, the Azure Machine Learning plugin for MLflow opens a browser to interactively prompt for credentials. For other ways to configure authentication for MLflow in Azure Machine Learning workspaces, see [Configure MLflow for Azure Machine Learning: Configure authentication](how-to-use-mlflow-configure-tracking.md#configure-authentication).
When you configure MLflow to exclusively track experiments in an Azure Machine Learning workspace, the experiment naming convention has to follow the one used by Azure Machine Learning. In Azure Databricks, experiments are named with the path to where the experiment is saved, for instance `/Users/[email protected]/iris-classifier`. In Azure Machine Learning, however, you provide the experiment name directly, so the same experiment would be named `iris-classifier`:
After this configuration, you can use MLflow in Azure Databricks in the same way as you're used to. For more information, see [Log & view metrics and log files](how-to-log-view-metrics.md).
> When working on shared environments, we recommend that you configure these environment variables at the compute. As a best practice, manage them as secrets in an instance of Azure Key Vault.
>
> For instance, in Azure Databricks you can use secrets in environment variables as follows in the cluster configuration: `AZURE_CLIENT_SECRET={{secrets/<scope-name>/<secret-name>}}`. For more information about implementing this approach in Azure Databricks, see [Reference a secret in an environment variable](/azure/databricks/security/secrets/secrets#reference-a-secret-in-an-environment-variable) or refer to documentation for your platform.
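For illustration, these are the environment variable names that the `azure-identity` library reads for service principal authentication (the values below are placeholders; in practice, source them from a secret store rather than hard-coding them):

```python
import os

# Placeholder values; azure-identity's EnvironmentCredential reads these names
os.environ["AZURE_TENANT_ID"] = "<tenant-id>"
os.environ["AZURE_CLIENT_ID"] = "<client-id>"
os.environ["AZURE_CLIENT_SECRET"] = "<client-secret>"
```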