
Commit 90a8d7c

Merge pull request #108022 from normesta/normesta-sdk-interop
Normesta sdk interop
2 parents 6a33498 + 4b9fb43 commit 90a8d7c

4 files changed (+125 -14 lines)

articles/storage/blobs/data-lake-storage-directory-file-acl-dotnet.md

Lines changed: 27 additions & 1 deletion
@@ -3,7 +3,7 @@ title: Azure Data Lake Storage Gen2 .NET SDK for files & ACLs (preview)
description: Use the Azure Storage client library to manage directories and file and directory access control lists (ACL) in storage accounts that have a hierarchical namespace (HNS) enabled.
author: normesta
ms.service: storage
-ms.date: 01/09/2020
+ms.date: 03/18/2020
ms.author: normesta
ms.topic: article
ms.subservice: data-lake-storage-gen2
@@ -197,6 +197,32 @@ public async Task UploadFile(DataLakeFileSystemClient fileSystemClient)
}
```

+> [!TIP]
+> If your file size is large, your code will have to make multiple calls to the [DataLakeFileClient.AppendAsync](https://docs.microsoft.com/dotnet/api/azure.storage.files.datalake.datalakefileclient.appendasync) method. Consider using the [DataLakeFileClient.UploadAsync](https://docs.microsoft.com/dotnet/api/azure.storage.files.datalake.datalakefileclient.uploadasync?view=azure-dotnet-preview#Azure_Storage_Files_DataLake_DataLakeFileClient_UploadAsync_System_IO_Stream_) method instead. That way, you can upload the entire file in a single call.
+>
+> See the next section for an example.
+
+## Upload a large file to a directory
+
+Use the [DataLakeFileClient.UploadAsync](https://docs.microsoft.com/dotnet/api/azure.storage.files.datalake.datalakefileclient.uploadasync?view=azure-dotnet-preview#Azure_Storage_Files_DataLake_DataLakeFileClient_UploadAsync_System_IO_Stream_) method to upload large files without having to make multiple calls to the [DataLakeFileClient.AppendAsync](https://docs.microsoft.com/dotnet/api/azure.storage.files.datalake.datalakefileclient.appendasync) method.
+
+```cs
+public async Task UploadFileBulk(DataLakeFileSystemClient fileSystemClient)
+{
+    DataLakeDirectoryClient directoryClient =
+        fileSystemClient.GetDirectoryClient("my-directory");
+
+    DataLakeFileClient fileClient = directoryClient.GetFileClient("uploaded-file.txt");
+
+    FileStream fileStream =
+        File.OpenRead("C:\\file-to-upload.txt");
+
+    await fileClient.UploadAsync(fileStream);
+}
+```
+
## Manage a file ACL

Get the access control list (ACL) of a file by calling the [DataLakeFileClient.GetAccessControlAsync](https://docs.microsoft.com/dotnet/api/azure.storage.files.datalake.datalakefileclient.getaccesscontrolasync) method and set the ACL by calling the [DataLakeFileClient.SetAccessControlList](https://docs.microsoft.com/dotnet/api/azure.storage.files.datalake.datalakefileclient.setaccesscontrollist) method.
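A minimal sketch of that call pattern, not taken from this commit: it assumes the `Azure.Storage.Files.DataLake` and `Azure.Storage.Files.DataLake.Models` usings shown earlier in the article, uses the async `SetAccessControlListAsync` variant, and the ACL string and method name are illustrative placeholders.

```cs
// Sketch only: read a file's ACL, then replace it with a new ACL.
// Assumes "fileClient" points to an existing file in the file system.
public async Task ManageFileACL(DataLakeFileClient fileClient)
{
    // Get the current access control settings of the file.
    PathAccessControl accessControl = await fileClient.GetAccessControlAsync();

    foreach (PathAccessControlItem item in accessControl.AccessControlList)
    {
        Console.WriteLine(item.ToString());
    }

    // Build a new ACL from its short string form and apply it.
    IList<PathAccessControlItem> newAcl =
        PathAccessControlExtensions.ParseAccessControlList("user::rwx,group::r-x,other::r--");

    await fileClient.SetAccessControlListAsync(newAcl);
}
```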

articles/storage/blobs/data-lake-storage-directory-file-acl-java.md

Lines changed: 68 additions & 12 deletions
@@ -3,7 +3,7 @@ title: Azure Data Lake Storage Gen2 Java SDK for files & ACLs (preview)
description: Use Azure Storage libraries for Java to manage directories and file and directory access control lists (ACL) in storage accounts that have a hierarchical namespace (HNS) enabled.
author: normesta
ms.service: storage
-ms.date: 11/24/2019
+ms.date: 03/18/2020
ms.author: normesta
ms.topic: conceptual
ms.subservice: data-lake-storage-gen2
@@ -29,9 +29,12 @@ This article shows you how to use Java to create and manage directories, files,

To get started, open [this page](https://search.maven.org/artifact/com.azure/azure-storage-file-datalake) and find the latest version of the Java library. Then, open the *pom.xml* file in your text editor. Add a dependency element that references that version.

+If you plan to authenticate your client application by using Azure Active Directory (AD), then add a dependency on the Azure Secret Client Library. See [Adding the Secret Client Library package to your project](https://github.com/Azure/azure-sdk-for-java/tree/master/sdk/identity/azure-identity#adding-the-package-to-your-project).
+
Next, add these import statements to your code file.

```java
+import com.azure.core.credential.TokenCredential;
import com.azure.storage.common.StorageSharedKeyCredential;
import com.azure.storage.file.datalake.DataLakeDirectoryClient;
import com.azure.storage.file.datalake.DataLakeFileClient;
@@ -46,6 +49,14 @@ import com.azure.storage.file.datalake.models.PathPermissions;
import com.azure.storage.file.datalake.models.RolePermissions;
```

+If you plan to authenticate your client application by using Azure AD, then add these import statements to your code file.
+
+```java
+import com.azure.identity.ClientSecretCredential;
+import com.azure.identity.ClientSecretCredentialBuilder;
+import com.azure.core.credential.TokenCredential;
+```
+
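A hedged sketch of how those identity classes might plug into the client builder, not part of this change; it assumes the `DataLakeServiceClient` and `DataLakeServiceClientBuilder` types from the `azure-storage-file-datalake` package, and the account, tenant, client ID, and secret values are placeholders.

```java
// Sketch only: build a DataLakeServiceClient that authenticates with Azure AD.
static public DataLakeServiceClient GetDataLakeServiceClientAD(
    String accountName, String clientId, String clientSecret, String tenantId) {

    // Build a client-secret credential from placeholder values.
    TokenCredential credential = new ClientSecretCredentialBuilder()
        .clientId(clientId)
        .clientSecret(clientSecret)
        .tenantId(tenantId)
        .build();

    DataLakeServiceClientBuilder builder = new DataLakeServiceClientBuilder();
    builder.credential(credential);
    builder.endpoint("https://" + accountName + ".dfs.core.windows.net");

    return builder.buildClient();
}
```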
## Connect to the account

To use the snippets in this article, you'll need to create a **DataLakeServiceClient** instance that represents the storage account. The easiest way to get one is to use an account key.
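For example, a minimal sketch of the account-key path could look like the following; it assumes the `DataLakeServiceClientBuilder` type from the same package, and the endpoint format and placeholder values are illustrative.

```java
// Sketch only: build a DataLakeServiceClient from an account name and key.
static public DataLakeServiceClient GetDataLakeServiceClient(
    String accountName, String accountKey) {

    StorageSharedKeyCredential sharedKeyCredential =
        new StorageSharedKeyCredential(accountName, accountKey);

    DataLakeServiceClientBuilder builder = new DataLakeServiceClientBuilder();
    builder.credential(sharedKeyCredential);
    builder.endpoint("https://" + accountName + ".dfs.core.windows.net");

    return builder.buildClient();
}
```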
@@ -116,7 +127,8 @@ static public DataLakeDirectoryClient
    DataLakeDirectoryClient directoryClient =
        fileSystemClient.getDirectoryClient("my-directory/my-subdirectory");

-    return directoryClient.rename("my-directory/my-subdirectory-renamed");
+    return directoryClient.rename(
+        fileSystemClient.getFileSystemName(), "my-subdirectory-renamed");
}
```
@@ -129,7 +141,8 @@ static public DataLakeDirectoryClient MoveDirectory
    DataLakeDirectoryClient directoryClient =
        fileSystemClient.getDirectoryClient("my-directory/my-subdirectory-renamed");

-    return directoryClient.rename("my-directory-2/my-subdirectory-renamed");
+    return directoryClient.rename(
+        fileSystemClient.getFileSystemName(), "my-directory-2/my-subdirectory-renamed");
}
```
@@ -169,11 +182,20 @@ static public void ManageDirectoryACLs(DataLakeFileSystemClient fileSystemClient
    System.out.println(PathAccessControlEntry.serializeList(pathPermissions));

-    PathPermissions permissions = new PathPermissions()
-
-        .group(new RolePermissions().execute(true).read(true))
-        .owner(new RolePermissions().execute(true).read(true).write(true))
-        .other(new RolePermissions().read(true));
+    RolePermissions groupPermission = new RolePermissions();
+    groupPermission.setExecutePermission(true).setReadPermission(true);
+
+    RolePermissions ownerPermission = new RolePermissions();
+    ownerPermission.setExecutePermission(true).setReadPermission(true).setWritePermission(true);
+
+    RolePermissions otherPermission = new RolePermissions();
+    otherPermission.setReadPermission(true);
+
+    PathPermissions permissions = new PathPermissions();
+
+    permissions.setGroup(groupPermission);
+    permissions.setOwner(ownerPermission);
+    permissions.setOther(otherPermission);

    directoryClient.setPermissions(permissions, null, null);
@@ -212,6 +234,31 @@ static public void UploadFile(DataLakeFileSystemClient fileSystemClient)
}
```

+> [!TIP]
+> If your file size is large, your code will have to make multiple calls to the **DataLakeFileClient.append** method. Consider using the **DataLakeFileClient.uploadFromFile** method instead. That way, you can upload the entire file in a single call.
+>
+> See the next section for an example.
+
+## Upload a large file to a directory
+
+Use the **DataLakeFileClient.uploadFromFile** method to upload large files without having to make multiple calls to the **DataLakeFileClient.append** method.
+
+```java
+static public void UploadFileBulk(DataLakeFileSystemClient fileSystemClient)
+    throws FileNotFoundException {
+
+    DataLakeDirectoryClient directoryClient =
+        fileSystemClient.getDirectoryClient("my-directory");
+
+    DataLakeFileClient fileClient = directoryClient.getFileClient("uploaded-file.txt");
+
+    fileClient.uploadFromFile("C:\\mytestfile.txt");
+}
+```
+
## Manage a file ACL

This example gets and then sets the ACL of a file named `upload-file.txt`. It gives the owning user read, write, and execute permissions, gives the owning group only read and execute permissions, and gives all others read access.
@@ -235,11 +282,20 @@ static public void ManageFileACLs(DataLakeFileSystemClient fileSystemClient){
    System.out.println(PathAccessControlEntry.serializeList(pathPermissions));

-    PathPermissions permissions = new PathPermissions()
-
-        .group(new RolePermissions().execute(true).read(true))
-        .owner(new RolePermissions().execute(true).read(true).write(true))
-        .other(new RolePermissions().read(false));
+    RolePermissions groupPermission = new RolePermissions();
+    groupPermission.setExecutePermission(true).setReadPermission(true);
+
+    RolePermissions ownerPermission = new RolePermissions();
+    ownerPermission.setExecutePermission(true).setReadPermission(true).setWritePermission(true);
+
+    RolePermissions otherPermission = new RolePermissions();
+    otherPermission.setReadPermission(true);
+
+    PathPermissions permissions = new PathPermissions();
+
+    permissions.setGroup(groupPermission);
+    permissions.setOwner(ownerPermission);
+    permissions.setOther(otherPermission);

    fileClient.setPermissions(permissions, null, null);

articles/storage/blobs/data-lake-storage-directory-file-acl-javascript.md

Lines changed: 1 addition & 1 deletion
@@ -3,7 +3,7 @@ title: Use JavaScript for files & ACLs in Azure Data Lake Storage Gen2 (preview)
description: Use Azure Storage Data Lake client library for JavaScript to manage directories and file and directory access control lists (ACL) in storage accounts that have a hierarchical namespace (HNS) enabled.
author: normesta
ms.service: storage
-ms.date: 12/18/2019
+ms.date: 03/18/2020
ms.author: normesta
ms.topic: conceptual
ms.subservice: data-lake-storage-gen2

articles/storage/blobs/data-lake-storage-directory-file-acl-python.md

Lines changed: 29 additions & 0 deletions
@@ -38,6 +38,8 @@ Add these import statements to the top of your code file.
```python
import os, uuid, sys
from azure.storage.filedatalake import DataLakeServiceClient
+from azure.core._match_conditions import MatchConditions
+from azure.storage.filedatalake._models import ContentSettings
```

## Connect to the account
@@ -190,6 +192,33 @@ def upload_file_to_directory():
        print(e)
```

+> [!TIP]
+> If your file size is large, your code will have to make multiple calls to the **DataLakeFileClient.append_data** method. Consider using the **DataLakeFileClient.upload_data** method instead. That way, you can upload the entire file in a single call.
+
+## Upload a large file to a directory
+
+Use the **DataLakeFileClient.upload_data** method to upload large files without having to make multiple calls to the **DataLakeFileClient.append_data** method.
+
+```python
+def upload_file_to_directory_bulk():
+    try:
+
+        file_system_client = service_client.get_file_system_client(file_system="my-file-system")
+
+        directory_client = file_system_client.get_directory_client("my-directory")
+
+        file_client = directory_client.get_file_client("uploaded-file.txt")
+
+        local_file = open("C:\\file-to-upload.txt", 'r')
+
+        file_contents = local_file.read()
+
+        file_client.upload_data(file_contents, overwrite=True)
+
+    except Exception as e:
+        print(e)
+```
+
## Manage file permissions

Get the access control list (ACL) of a file by calling the **DataLakeFileClient.get_access_control** method and set the ACL by calling the **DataLakeFileClient.set_access_control** method.
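A brief sketch of that pattern, assuming the `service_client` created earlier in the article; the file-system, directory, and file names and the new permission string are placeholders.

```python
def manage_file_permissions():
    try:
        file_system_client = service_client.get_file_system_client(file_system="my-file-system")
        directory_client = file_system_client.get_directory_client("my-directory")
        file_client = directory_client.get_file_client("uploaded-file.txt")

        # Read the file's current access control settings and print its permissions.
        acl_props = file_client.get_access_control()
        print(acl_props['permissions'])

        # Apply a new permission set: owner rwx, group r-x, other r--.
        new_permissions = "rwxr-xr--"
        file_client.set_access_control(permissions=new_permissions)

    except Exception as e:
        print(e)
```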
