Commit 7c9cb74

Merge pull request #264825 from ynpandey/patch-43

Update how-to-managed-network.md

2 parents b9bffcd + 2596103
1 file changed: articles/machine-learning/how-to-managed-network.md (4 additions, 4 deletions)
@@ -702,7 +702,7 @@ To enable the [serverless Spark jobs](how-to-submit-spark-jobs.md) for the manag
 Use a YAML file to define the managed VNet configuration and add a private endpoint for the Azure Storage Account. Also set `spark_enabled: true`:
 
 > [!TIP]
-> This example is for a managed VNet configured using `isolation_mode: allow_internet_outbound` to allow internet traffic. If you want to allow only approved outbound traffic to enable data exfiltration protection (DEP), use `isolation_mode: allow_only_approved_outbound`.
+> This example is for a managed VNet configured using `isolation_mode: allow_internet_outbound` to allow internet traffic. If you want to allow only approved outbound traffic, use `isolation_mode: allow_only_approved_outbound`.
 
 ```yml
 name: myworkspace
@@ -724,15 +724,15 @@ To enable the [serverless Spark jobs](how-to-submit-spark-jobs.md) for the manag
 ```
 
 > [!NOTE]
-> - When data exfiltration protection (DEP) is enabled, conda package dependencies defined in Spark session configuration will fail to install. To resolve this problem, upload a self-contained Python package wheel with no external dependencies to an Azure storage account and create private endpoint to this storage account. Use the path to Python package wheel as `py_files` parameter in your Spark job.
+> - When **Allow Only Approved Outbound** is enabled (`isolation_mode: allow_only_approved_outbound`), conda package dependencies defined in Spark session configuration will fail to install. To resolve this problem, upload a self-contained Python package wheel with no external dependencies to an Azure storage account and create private endpoint to this storage account. Use the path to Python package wheel as `py_files` parameter in your Spark job.
 > - If the workspace was created with `isolation_mode: allow_internet_outbound`, it can not be updated later to use `isolation_mode: allow_only_approved_outbound`.
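
The hunks show only the opening line of the YAML block (`name: myworkspace`) and its closing fence; the body of the configuration falls between them and isn't part of this change. For orientation, a minimal sketch of what such a `managed_network` configuration typically looks like; the rule name and angle-bracket placeholders are illustrative assumptions, not the file's actual contents:

```yml
# Sketch only: the rule name and placeholder values are assumptions for illustration.
name: myworkspace
managed_network:
  isolation_mode: allow_internet_outbound   # or allow_only_approved_outbound
  outbound_rules:
  - name: added-perule                      # hypothetical rule name
    type: private_endpoint
    destination:
      service_resource_id: /subscriptions/<SUBSCRIPTION_ID>/resourceGroups/<RESOURCE_GROUP>/providers/Microsoft.Storage/storageAccounts/<STORAGE_ACCOUNT_NAME>
      subresource_target: blob
      spark_enabled: true                   # enables the private endpoint for serverless Spark
```

A file like this would typically be applied with `az ml workspace update --file workspace.yaml --resource-group <RESOURCE_GROUP> --name myworkspace`.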
 
 # [Python SDK](#tab/python)
 
 The following example demonstrates how to create a managed VNet for an existing Azure Machine Learning workspace named `myworkspace`. It also adds a private endpoint for the Azure Storage Account and sets `spark_enabled=true`:
 
 > [!TIP]
-> The following example is for a managed VNet configured using `IsolationMode.ALLOW_INTERNET_OUTBOUND` to allow internet traffic. If you want to allow only approved outbound traffic to enable data exfiltration protection (DEP), use `IsolationMode.ALLOW_ONLY_APPROVED_OUTBOUND`.
+> The following example is for a managed VNet configured using `IsolationMode.ALLOW_INTERNET_OUTBOUND` to allow internet traffic. If you want to allow only approved outbound traffic, use `IsolationMode.ALLOW_ONLY_APPROVED_OUTBOUND`.
 
 ```python
 # Get the existing workspace
@@ -759,7 +759,7 @@ To enable the [serverless Spark jobs](how-to-submit-spark-jobs.md) for the manag
 ml_client.workspaces.begin_update(ws)
 ```
 > [!NOTE]
-> - When data exfiltration protection (DEP) is enabled, conda package dependencies defined in Spark session configuration will fail to install. To resolve this problem, upload a self-contained Python package wheel with no external dependencies to an Azure storage account and create private endpoint to this storage account. Use the path to Python package wheel as `py_files` parameter in the Spark job.
+> - When **Allow Only Approved Outbound** is enabled (`isolation_mode: allow_only_approved_outbound`), conda package dependencies defined in Spark session configuration will fail to install. To resolve this problem, upload a self-contained Python package wheel with no external dependencies to an Azure storage account and create private endpoint to this storage account. Use the path to Python package wheel as `py_files` parameter in the Spark job.
 > - If the workspace was created with `IsolationMode.ALLOW_INTERNET_OUTBOUND`, it can not be updated later to use `IsolationMode.ALLOW_ONLY_APPROVED_OUTBOUND`.
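
As with the YAML tab, the diff shows only the first and last lines of the Python example (`# Get the existing workspace` and `ml_client.workspaces.begin_update(ws)`); the SDK calls in between are not part of this change. A minimal sketch of the steps those two lines presumably bracket, using the `azure-ai-ml` v2 SDK; the client setup, rule name, and placeholder resource IDs are assumptions for illustration:

```python
# Sketch only: placeholder IDs and the rule name are assumptions.
from azure.ai.ml import MLClient
from azure.ai.ml.entities import ManagedNetwork, PrivateEndpointDestination
from azure.identity import DefaultAzureCredential

# Connect to the existing workspace
ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<SUBSCRIPTION_ID>",
    resource_group_name="<RESOURCE_GROUP>",
    workspace_name="myworkspace",
)

# Get the existing workspace and enable a managed VNet that allows internet
# outbound (the string corresponds to IsolationMode.ALLOW_INTERNET_OUTBOUND)
ws = ml_client.workspaces.get()
ws.managed_network = ManagedNetwork(isolation_mode="allow_internet_outbound")

# Add a private endpoint outbound rule for the storage account, with Spark enabled
ws.managed_network.outbound_rules = [
    PrivateEndpointDestination(
        name="added-perule",  # hypothetical rule name
        service_resource_id=(
            "/subscriptions/<SUBSCRIPTION_ID>/resourceGroups/<RESOURCE_GROUP>"
            "/providers/Microsoft.Storage/storageAccounts/<STORAGE_ACCOUNT_NAME>"
        ),
        subresource_target="blob",
        spark_enabled=True,
    )
]

# Apply the update; this is the begin_update call visible in the hunk above
ml_client.workspaces.begin_update(ws)
```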