
Commit 9ff0b41

Merge pull request #292354 from Padmalathas/UUF-DecemberFixes-Patch
UUF Updates and Fixes
2 parents: 7af7265 + ff4425f

2 files changed: +39 -4 lines changed

articles/batch/managed-identity-pools.md

Lines changed: 23 additions & 1 deletion
````diff
@@ -2,8 +2,9 @@
 title: Configure managed identities in Batch pools
 description: Learn how to enable user-assigned managed identities on Batch pools and how to use managed identities within the nodes.
 ms.topic: conceptual
-ms.date: 08/12/2024
+ms.date: 12/23/2024
 ms.devlang: csharp
+ai-usage: ai-assisted
 ms.custom:
 ---
 # Configure managed identities in Batch pools
@@ -13,6 +14,10 @@ complicated identity and credential management by providing an identity for the
 (Azure AD ID). This identity is used to obtain Microsoft Entra tokens to authenticate with target
 resources in Azure.
 
+When adding a user-assigned managed identity to a Batch pool, it is crucial to set the *Identity* property in your configuration. This property links the managed identity to the pool, enabling it to access Azure resources securely. Setting the *Identity* property incorrectly can result in common errors, such as access issues or upload errors.
+
+For more information on configuring managed identities in Azure Batch, see the [Azure Batch Managed Identities documentation](/troubleshoot/azure/hpc/batch/use-managed-identities-azure-batch-account-pool).
+
 This topic explains how to enable user-assigned managed identities on Batch pools and how to use managed identities within the nodes.
 
 > [!IMPORTANT]
@@ -106,6 +111,23 @@ ArmOperation<BatchAccountPoolResource> armOperation = batchAccount.GetBatchAccou
 BatchAccountPoolResource pool = armOperation.Value;
 ```
 
+> [!NOTE]
+> To include the *Identity* property, use the following example code:
+```csharp
+var pool = batchClient.PoolOperations.CreatePool(
+    poolId: "myPool",
+    virtualMachineSize: "STANDARD_D2_V2",
+    cloudServiceConfiguration: new CloudServiceConfiguration(osFamily: "4"),
+    targetDedicatedNodes: 1,
+    identity: new PoolIdentity(
+        type: PoolIdentityType.UserAssigned,
+        userAssignedIdentities: new Dictionary<string, UserAssignedIdentity>
+        {
+            { "/subscriptions/{subscription-id}/resourceGroups/{resource-group}/providers/Microsoft.ManagedIdentity/userAssignedIdentities/{identity-name}", new UserAssignedIdentity() }
+        }
+    ));
+```
+
 ## Use user-assigned managed identities in Batch nodes
 
 Many Azure Batch functions that access other Azure resources directly on the compute nodes, such as Azure Storage or
````
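
The note added above calls `batchClient.PoolOperations.CreatePool` on the Batch service client, while the surrounding snippet (`ArmOperation<BatchAccountPoolResource>`) uses the management-plane `Azure.ResourceManager.Batch` SDK. The following is a rough sketch of setting the *Identity* property through that management-plane SDK; the subscription, resource group, account, identity IDs, pool name, and VM size are placeholders, and other required pool settings (such as the deployment configuration) are omitted.

```csharp
// Sketch only: attach a user-assigned managed identity to a Batch pool definition
// with the Azure.ResourceManager.Batch (management-plane) SDK. All IDs are placeholders.
using Azure;
using Azure.Core;
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.Batch;
using Azure.ResourceManager.Models;

var armClient = new ArmClient(new DefaultAzureCredential());

// Placeholder Batch account and user-assigned identity resource IDs.
ResourceIdentifier batchAccountId = BatchAccountResource.CreateResourceIdentifier(
    "{subscription-id}", "{resource-group}", "{batch-account-name}");
var identityResourceId = new ResourceIdentifier(
    "/subscriptions/{subscription-id}/resourceGroups/{resource-group}/providers/Microsoft.ManagedIdentity/userAssignedIdentities/{identity-name}");

BatchAccountResource batchAccount = armClient.GetBatchAccountResource(batchAccountId);

var poolData = new BatchAccountPoolData
{
    VmSize = "STANDARD_D2S_V3",
    // The Identity property is what links the user-assigned managed identity to the pool.
    Identity = new ManagedServiceIdentity(ManagedServiceIdentityType.UserAssigned)
    {
        UserAssignedIdentities = { [identityResourceId] = new UserAssignedIdentity() }
    }
    // Deployment configuration and scale settings omitted for brevity.
};

ArmOperation<BatchAccountPoolResource> operation = batchAccount
    .GetBatchAccountPools()
    .CreateOrUpdate(WaitUntil.Completed, "myPool", poolData);
BatchAccountPoolResource pool = operation.Value;
```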

articles/batch/tutorial-run-python-batch-azure-data-factory.md

Lines changed: 16 additions & 3 deletions
````diff
@@ -3,7 +3,8 @@ title: 'Tutorial: Run a Batch job through Azure Data Factory'
 description: Learn how to use Batch Explorer, Azure Storage Explorer, and a Python script to run a Batch workload through an Azure Data Factory pipeline.
 ms.devlang: python
 ms.topic: tutorial
-ms.date: 03/01/2024
+ms.date: 12/23/2024
+ai-usage: ai-assisted
 ms.custom: mvc, devx-track-python
 ---
 
@@ -82,8 +83,10 @@ Paste the connection string into the following script, replacing the `<storage-a
 
 ``` python
 # Load libraries
-from azure.storage.blob import BlobClient
+from azure.storage.blob import BlobClient  # still used for the upload step below
+from azure.storage.blob import BlobServiceClient
 import pandas as pd
+import io
 
 # Define parameters
 connectionString = "<storage-account-connection-string>"
@@ -93,8 +96,16 @@ outputBlobName = "iris_setosa.csv"
 # Establish connection with the blob storage account
 blob = BlobClient.from_connection_string(conn_str=connectionString, container_name=containerName, blob_name=outputBlobName)
 
+# Initialize the BlobServiceClient and get a client for the blob so its content can be downloaded and loaded into a pandas DataFrame for further processing
+blob_service_client = BlobServiceClient.from_connection_string(conn_str=connectionString)
+blob_client = blob_service_client.get_blob_client(container=containerName, blob=outputBlobName)
+
+# Download the blob content
+blob_data = blob_client.download_blob().readall()
+
 # Load iris dataset from the task node
-df = pd.read_csv("iris.csv")
+# df = pd.read_csv("iris.csv")
+df = pd.read_csv(io.BytesIO(blob_data))
 
 # Take a subset of the records
 df = df[df['Species'] == "setosa"]
@@ -106,6 +117,8 @@ with open(outputBlobName, "rb") as data:
 blob.upload_blob(data, overwrite=True)
 ```
 
+For more information on working with Azure Blob Storage, refer to the [Azure Blob Storage documentation](/azure/storage/blobs/storage-blobs-introduction).
+
 Run the script locally to test and validate functionality.
 
 ``` bash
````
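
For context, here is a rough, self-contained sketch of the download, filter, and upload pattern the updated script uses, written against the `azure-storage-blob` v12 client. The connection string, container name, and blob names are placeholder assumptions rather than values taken from the tutorial.

```python
import io

import pandas as pd
from azure.storage.blob import BlobServiceClient

# Placeholder values; substitute your own storage account details.
connection_string = "<storage-account-connection-string>"
container_name = "<container-name>"
input_blob_name = "iris.csv"          # assumed input blob
output_blob_name = "iris_setosa.csv"

service_client = BlobServiceClient.from_connection_string(conn_str=connection_string)

# Download the input CSV into memory and load it into a DataFrame.
input_client = service_client.get_blob_client(container=container_name, blob=input_blob_name)
df = pd.read_csv(io.BytesIO(input_client.download_blob().readall()))

# Keep only the setosa records and upload the result as a new blob.
subset = df[df["Species"] == "setosa"]
output_client = service_client.get_blob_client(container=container_name, blob=output_blob_name)
output_client.upload_blob(subset.to_csv(index=False), overwrite=True)
```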
