---
title: Use Python to manage data in Azure Data Lake Storage Gen2
titleSuffix: Azure Storage
description: Use Python to manage directories and files in a storage account that has hierarchical namespace enabled.
author: pauljewellmsft
ms.author: pauljewell
ms.service: storage
ms.date: 12/20/2022
ms.topic: how-to
ms.subservice: data-lake-storage-gen2
ms.reviewer: prishet

## Prerequisites

- An Azure subscription. See [Get Azure free trial](https://azure.microsoft.com/pricing/free-trial/).
- A storage account that has [hierarchical namespace](./data-lake-storage-namespace.md) enabled. Follow [these](create-data-lake-storage-account.md) instructions to create one.
## Set up your project
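
The full snippet for this section isn't reproduced here, so the following is a minimal setup sketch. It assumes the client library is installed from PyPI as `azure-storage-file-datalake`; the `ContentSettings` import matches the one used in this article's import list.

```python
# Install the client library first (assumed PyPI package name):
#   pip install azure-storage-file-datalake

from azure.storage.filedatalake import DataLakeServiceClient
from azure.storage.filedatalake._models import ContentSettings
```
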
## Connect to the account
To use the snippets in this article, you'll need to create a [DataLakeServiceClient](/python/api/azure-storage-file-datalake/azure.storage.filedatalake.datalakeserviceclient) instance that represents the storage account.
### Connect by using an account key
Using an account key is the easiest way to connect to an account.
This example creates a **DataLakeServiceClient** instance by using an account key.
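
A sketch of what this can look like; the helper name and the `account_name` and `account_key` parameters are illustrative, not part of the library.

```python
from azure.storage.filedatalake import DataLakeServiceClient

def get_service_client_account_key(account_name, account_key):
    # The Data Lake Storage endpoint uses the dfs.core.windows.net host.
    account_url = f"https://{account_name}.dfs.core.windows.net"
    # Passing the account key string as the credential authorizes with Shared Key.
    return DataLakeServiceClient(account_url=account_url, credential=account_key)
```
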

### Connect by using Azure Active Directory (Azure AD)

You can use the [Azure identity client library for Python](https://pypi.org/project/azure-identity/) to authenticate your application with Azure AD.
This example creates a **DataLakeServiceClient** instance with a [ClientSecretCredential](/python/api/azure-identity/azure.identity.clientsecretcredential) credential. You create the credential object with a client ID, a client secret, and a tenant ID representing a security principal. To get these values, see [Authorize access to blob or queue data from a native or web application](../common/storage-auth-aad-app.md). Assign the **Storage Blob Data Contributor** role to the service principal.
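
A sketch under the same assumptions; the helper name and parameter names are illustrative.

```python
from azure.identity import ClientSecretCredential
from azure.storage.filedatalake import DataLakeServiceClient

def get_service_client_azure_ad(account_name, tenant_id, client_id, client_secret):
    # ClientSecretCredential takes the tenant ID, client ID, and client secret
    # of the service principal, in that order.
    credential = ClientSecretCredential(tenant_id, client_id, client_secret)
    account_url = f"https://{account_name}.dfs.core.windows.net"
    return DataLakeServiceClient(account_url=account_url, credential=credential)
```
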
To use this code, make sure you have an import statement for the **ClientSecretCredential**: `from azure.identity import ClientSecretCredential`.
> [!NOTE]
> For more examples, see the [Azure identity client library for Python](https://pypi.org/project/azure-identity/) documentation.
## Create a container
A container acts as a file system for your files. You can create one by calling the [DataLakeServiceClient.create_file_system method](/python/api/azure-storage-file-datalake/azure.storage.filedatalake.datalakeserviceclient#azure-storage-filedatalake-datalakeserviceclient-create-file-system).
This example creates a container named `my-file-system`.
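
A minimal sketch of that call; the helper name is illustrative, and `service_client` is assumed to be a **DataLakeServiceClient** created as shown earlier.

```python
def create_file_system(service_client):
    # create_file_system returns a FileSystemClient for the new container.
    return service_client.create_file_system(file_system="my-file-system")
```
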

## Create a directory

Create a directory reference by calling the [FileSystemClient.create_directory](/python/api/azure-storage-file-datalake/azure.storage.filedatalake.filesystemclient#azure-storage-filedatalake-filesystemclient-create-directory) method.
This example adds a directory named `my-directory` to a container.
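
A sketch of that step, assuming `file_system_client` is the **FileSystemClient** for the container; the helper name is illustrative.

```python
def create_directory(file_system_client):
    # create_directory returns a DataLakeDirectoryClient for the new directory.
    return file_system_client.create_directory("my-directory")
```
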

## Rename or move a directory

Rename or move a directory by calling the [DataLakeDirectoryClient.rename_directory](/python/api/azure-storage-file-datalake/azure.storage.filedatalake.datalakedirectoryclient#azure-storage-filedatalake-datalakedirectoryclient-rename-directory) method. Pass the path of the desired directory as a parameter.
This example renames a subdirectory to the name `my-directory-renamed`.
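
A sketch of that rename, assuming `directory_client` points to the subdirectory; the new path is passed in the form `<file system>/<path>`.

```python
def rename_directory(directory_client):
    new_dir_name = "my-directory-renamed"
    # rename_directory expects the destination as "<file system name>/<new path>".
    directory_client.rename_directory(
        new_name=directory_client.file_system_name + "/" + new_dir_name)
```
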

## Delete a directory

Delete a directory by calling the [DataLakeDirectoryClient.delete_directory](/python/api/azure-storage-file-datalake/azure.storage.filedatalake.datalakedirectoryclient#azure-storage-filedatalake-datalakedirectoryclient-delete-directory) method.
This example deletes a directory named `my-directory`.
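
A minimal sketch, assuming `file_system_client` is the container's **FileSystemClient**; the helper name is illustrative.

```python
def delete_directory(file_system_client):
    # Get a client for the existing directory, then delete it.
    directory_client = file_system_client.get_directory_client("my-directory")
    directory_client.delete_directory()
```
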

## Upload a file to a directory

First, create a file reference in the target directory by creating an instance of the **DataLakeFileClient** class. Upload a file by calling the [DataLakeFileClient.append_data](/python/api/azure-storage-file-datalake/azure.storage.filedatalake.datalakefileclient#azure-storage-filedatalake-datalakefileclient-append-data) method. Make sure to complete the upload by calling the [DataLakeFileClient.flush_data](/python/api/azure-storage-file-datalake/azure.storage.filedatalake.datalakefileclient#azure-storage-filedatalake-datalakefileclient-flush-data) method.
This example uploads a text file to a directory named `my-directory`.
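
A sketch of that sequence; the local and remote file names are illustrative, and `file_system_client` is assumed to be the container's **FileSystemClient**.

```python
def upload_file_to_directory(file_system_client):
    directory_client = file_system_client.get_directory_client("my-directory")

    # Create a file reference in the target directory (file name is illustrative).
    file_client = directory_client.create_file("uploaded-file.txt")

    with open("./uploaded-file.txt", "rb") as local_file:
        file_contents = local_file.read()

    # Append the bytes, then flush to commit the upload.
    file_client.append_data(data=file_contents, offset=0, length=len(file_contents))
    file_client.flush_data(len(file_contents))
```
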
> If your file size is large, your code will have to make multiple calls to the **DataLakeFileClient** [append_data](/python/api/azure-storage-file-datalake/azure.storage.filedatalake.datalakefileclient#azure-storage-filedatalake-datalakefileclient-append-data) method. Consider using the [upload_data](/python/api/azure-storage-file-datalake/azure.storage.filedatalake.datalakefileclient#azure-storage-filedatalake-datalakefileclient-upload-data) method instead. That way, you can upload the entire file in a single call.
## Upload a large file to a directory
Use the [DataLakeFileClient.upload_data](/python/api/azure-storage-file-datalake/azure.storage.filedatalake.datalakefileclient#azure-storage-filedatalake-datalakefileclient-upload-data) method to upload large files without having to make multiple calls to the [DataLakeFileClient.append_data](/python/api/azure-storage-file-datalake/azure.storage.filedatalake.datalakefileclient#azure-storage-filedatalake-datalakefileclient-append-data) method.
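
A sketch of that approach; the file names are illustrative, and `overwrite=True` is assumed so an existing destination file is replaced.

```python
def upload_large_file_to_directory(file_system_client):
    directory_client = file_system_client.get_directory_client("my-directory")
    file_client = directory_client.get_file_client("uploaded-file.txt")

    with open("./uploaded-file.txt", "rb") as local_file:
        # upload_data handles chunking internally, so the whole file is
        # uploaded in a single call from the caller's point of view.
        file_client.upload_data(local_file, overwrite=True)
```
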

## Download from a directory

Open a local file for writing. Then, create a **DataLakeFileClient** instance that represents the file that you want to download. Call the [DataLakeFileClient.download_file](/python/api/azure-storage-file-datalake/azure.storage.filedatalake.datalakefileclient#azure-storage-filedatalake-datalakefileclient-download-file) method to read bytes from the file, and then write those bytes to the local file.
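
A sketch of that flow; the remote and local file names are illustrative.

```python
def download_file_from_directory(file_system_client):
    directory_client = file_system_client.get_directory_client("my-directory")
    file_client = directory_client.get_file_client("uploaded-file.txt")

    # Open a local file for writing and stream the downloaded bytes into it.
    with open("./downloaded-file.txt", "wb") as local_file:
        download = file_client.download_file()
        local_file.write(download.readall())
```
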

## List directory contents

List directory contents by calling the [FileSystemClient.get_paths](/python/api/azure-storage-file-datalake/azure.storage.filedatalake.filesystemclient#azure-storage-filedatalake-filesystemclient-get-paths) method, and then enumerating through the results.
This example prints the path of each subdirectory and file that is located in a directory named `my-directory`.
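
A minimal sketch of that enumeration, assuming `file_system_client` is the container's **FileSystemClient**.

```python
def list_directory_contents(file_system_client):
    # get_paths returns an iterable of PathProperties objects for everything
    # under the given path.
    paths = file_system_client.get_paths(path="my-directory")
    for path in paths:
        print(path.name)
```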