articles/batch/tutorial-run-python-batch-azure-data-factory.md
16 additions & 3 deletions
@@ -3,7 +3,8 @@ title: 'Tutorial: Run a Batch job through Azure Data Factory'
 description: Learn how to use Batch Explorer, Azure Storage Explorer, and a Python script to run a Batch workload through an Azure Data Factory pipeline.
 ms.devlang: python
 ms.topic: tutorial
-ms.date: 03/01/2024
+ms.date: 12/23/2024
+ai-usage: ai-assisted
 ms.custom: mvc, devx-track-python
 ---
@@ -82,8 +83,10 @@ Paste the connection string into the following script, replacing the `<storage-a
 # Initialize the BlobServiceClient. (This establishes a connection to Azure Blob Storage, downloads the content of the 'iris.csv' file, and then loads it into a pandas DataFrame for further processing.)
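The download-and-load step described in that comment can be sketched as follows. This is a hedged illustration, not the tutorial's exact script: the connection string is the placeholder the tutorial tells you to replace, and the container name `input` is a hypothetical value. Running it requires real Azure Storage credentials.

```python
from io import BytesIO

import pandas as pd
from azure.storage.blob import BlobServiceClient

# Placeholder values; substitute a real connection string and container name.
connection_string = "<storage-account-connection-string>"
container_name = "input"  # hypothetical container name

# Initialize the BlobServiceClient from the storage account connection string.
blob_service_client = BlobServiceClient.from_connection_string(connection_string)
blob_client = blob_service_client.get_blob_client(container=container_name, blob="iris.csv")

# download_blob() returns a StorageStreamDownloader; readall() yields the raw bytes.
csv_bytes = blob_client.download_blob().readall()

# Load the CSV content into a pandas DataFrame for further processing.
df = pd.read_csv(BytesIO(csv_bytes))
```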
@@ -106,6 +117,8 @@ with open(outputBlobName, "rb") as data:
     blob.upload_blob(data, overwrite=True)
 ```
 
+For more information on working with Azure Blob Storage, refer to the [Azure Blob Storage documentation](https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blobs-introduction).
+
 Run the script locally to test and validate functionality.
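For context, the `upload_blob` call shown in the diff hunk above might sit in code like the following sketch. The connection string, the `output` container name, and the `outputBlobName` value are assumptions standing in for the script's real values, and running it requires real Azure Storage credentials.

```python
from azure.storage.blob import BlobServiceClient

connection_string = "<storage-account-connection-string>"  # placeholder
outputBlobName = "iris_setosa.csv"  # hypothetical local file / blob name

service = BlobServiceClient.from_connection_string(connection_string)
blob = service.get_blob_client(container="output", blob=outputBlobName)  # "output" is assumed

# Upload the local file, overwriting any existing blob with the same name.
with open(outputBlobName, "rb") as data:
    blob.upload_blob(data, overwrite=True)
```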