
Commit 8fbf2ae

Merge pull request #189567 from normesta/normesta-reg-updates-7: Reverting Vendor updates

2 parents: 47bee80 + 95da0ec

File tree

1 file changed: +59 −84 lines


articles/data-lake-store/data-lake-store-get-started-java-sdk.md

Lines changed: 59 additions & 84 deletions
@@ -5,7 +5,7 @@ description: Use the Java SDK for Azure Data Lake Storage Gen1 to perform filesy
 author: normesta
 ms.service: data-lake-store
 ms.topic: how-to
-ms.date: 05/29/2018
+ms.date: 02/23/2022
 ms.custom: devx-track-java
 ms.author: normesta

@@ -38,71 +38,58 @@ The code sample available [on GitHub](https://azure.microsoft.com/documentation/

    ```xml
    <dependencies>
-     <dependency>
-       <groupId>com.azure</groupId>
-       <artifactId>azure-identity</artifactId>
-       <version>1.4.1</version>
-     </dependency>
-     <dependency>
-       <groupId>com.azure</groupId>
-       <artifactId>azure-storage-file-datalake</artifactId>
-       <version>12.7.2</version>
-     </dependency>
-     <dependency>
-       <groupId>org.slf4j</groupId>
-       <artifactId>slf4j-nop</artifactId>
-       <version>1.7.32</version>
-     </dependency>
+     <dependency>
+       <groupId>com.microsoft.azure</groupId>
+       <artifactId>azure-data-lake-store-sdk</artifactId>
+       <version>2.1.5</version>
+     </dependency>
+     <dependency>
+       <groupId>org.slf4j</groupId>
+       <artifactId>slf4j-nop</artifactId>
+       <version>1.7.21</version>
+     </dependency>
    </dependencies>
    ```

-   The second dependency is to use the Data Lake Storage Gen2 SDK (`azure-storage-file-datalake`) from the Maven repository. The third dependency is to specify the logging framework (`slf4j-nop`) to use for this application. The Data Lake Storage Gen2 SDK uses the [SLF4J](https://www.slf4j.org/) logging façade, which lets you choose from a number of popular logging frameworks, like Log4j, Java logging, Logback, etc., or no logging. For this example, we disable logging, hence we use the **slf4j-nop** binding. To use other logging options in your app, see [here](https://www.slf4j.org/manual.html#projectDep).
+   The first dependency is to use the Data Lake Storage Gen1 SDK (`azure-data-lake-store-sdk`) from the Maven repository. The second dependency is to specify the logging framework (`slf4j-nop`) to use for this application. The Data Lake Storage Gen1 SDK uses the [SLF4J](https://www.slf4j.org/) logging façade, which lets you choose from a number of popular logging frameworks, like Log4j, Java logging, Logback, etc., or no logging. For this example, we disable logging, hence we use the **slf4j-nop** binding. To use other logging options in your app, see [here](https://www.slf4j.org/manual.html#projectDep).

 3. Add the following import statements to your application.

    ```java
-   import com.azure.identity.ClientSecretCredential;
-   import com.azure.identity.ClientSecretCredentialBuilder;
-   import com.azure.storage.file.datalake.DataLakeDirectoryClient;
-   import com.azure.storage.file.datalake.DataLakeFileClient;
-   import com.azure.storage.file.datalake.DataLakeServiceClient;
-   import com.azure.storage.file.datalake.DataLakeServiceClientBuilder;
-   import com.azure.storage.file.datalake.DataLakeFileSystemClient;
-   import com.azure.storage.file.datalake.models.ListPathsOptions;
-   import com.azure.storage.file.datalake.models.PathAccessControl;
-   import com.azure.storage.file.datalake.models.PathPermissions;
+   import com.microsoft.azure.datalake.store.ADLException;
+   import com.microsoft.azure.datalake.store.ADLStoreClient;
+   import com.microsoft.azure.datalake.store.DirectoryEntry;
+   import com.microsoft.azure.datalake.store.IfExists;
+   import com.microsoft.azure.datalake.store.oauth2.AccessTokenProvider;
+   import com.microsoft.azure.datalake.store.oauth2.ClientCredsTokenProvider;

    import java.io.*;
-   import java.time.Duration;
    import java.util.Arrays;
    import java.util.List;
-   import java.util.Map;
    ```

 ## Authentication

-* For end-user authentication for your application, see [End-user authentication with Data Lake Storage Gen2 using Java](data-lake-store-end-user-authenticate-java-sdk.md).
-* For service-to-service authentication for your application, see [Service-to-service authentication with Data Lake Storage Gen2 using Java](data-lake-store-service-to-service-authenticate-java.md).
+* For end-user authentication for your application, see [End-user authentication with Data Lake Storage Gen1 using Java](data-lake-store-end-user-authenticate-java-sdk.md).
+* For service-to-service authentication for your application, see [Service-to-service authentication with Data Lake Storage Gen1 using Java](data-lake-store-service-to-service-authenticate-java.md).

-## Create a Data Lake Storage Gen2 client
-Creating a [DataLakeServiceClient](https://azure.github.io/azure-sdk-for-java/datalakestorage%28gen2%29.html) object requires you to specify the Data Lake Storage Gen2 account name and the token provider you generated when you authenticated with Data Lake Storage Gen2 (see the [Authentication](#authentication) section). The Data Lake Storage Gen2 account name needs to be a fully qualified domain name. For example, replace **FILL-IN-HERE** with something like **mydatalakestoragegen1.azuredatalakestore.net**.
+## Create a Data Lake Storage Gen1 client
+Creating an [ADLStoreClient](https://azure.github.io/azure-data-lake-store-java/javadoc/) object requires you to specify the Data Lake Storage Gen1 account name and the token provider you generated when you authenticated with Data Lake Storage Gen1 (see the [Authentication](#authentication) section). The Data Lake Storage Gen1 account name needs to be a fully qualified domain name. For example, replace **FILL-IN-HERE** with something like **mydatalakestoragegen1.azuredatalakestore.net**.

 ```java
-private static String endPoint = "FILL-IN-HERE"; // Data lake storage end point
-DataLakeServiceClient dataLakeServiceClient = new DataLakeServiceClientBuilder().endpoint(endPoint).credential(credential).buildClient();
+private static String accountFQDN = "FILL-IN-HERE"; // full account FQDN, not just the account name
+ADLStoreClient client = ADLStoreClient.createClient(accountFQDN, provider);
 ```

-The code snippets in the following sections contain examples of some common filesystem operations. You can look at the full [Data Lake Storage Gen2 Java SDK API docs](https://azure.github.io/azure-sdk-for-java/datalakestorage%28gen2%29.html) of the **DataLakeServiceClient** object to see other operations.
+The code snippets in the following sections contain examples of some common filesystem operations. You can look at the full [Data Lake Storage Gen1 Java SDK API docs](https://azure.github.io/azure-data-lake-store-java/javadoc/) of the **ADLStoreClient** object to see other operations.

 ## Create a directory

-The following snippet creates a directory structure in the root of the Data Lake Storage Gen2 account you specified.
+The following snippet creates a directory structure in the root of the Data Lake Storage Gen1 account you specified.

 ```java
 // create directory
-private String fileSystemName = "FILL-IN-HERE"
-DataLakeFileSystemClient dataLakeFileSystemClient = dataLakeServiceClient.createFileSystem(fileSystemName);
-dataLakeFileSystemClient.createDirectory("a/b/w");
+client.createDirectory("/a/b/w");
 System.out.println("Directory created.");
 ```
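Both the removed and restored client snippets depend on the account name being a fully qualified domain name. As an illustration only (the helper name and suffix constant are assumptions for this sketch, not part of the SDK, which simply takes the finished string), a minimal FQDN builder could look like this:

```java
// Sketch: building the fully qualified domain name an ADLStoreClient expects.
// The class and helper names are illustrative assumptions, not SDK API.
public class AccountFqdn {
    static final String GEN1_SUFFIX = ".azuredatalakestore.net";

    // Accepts either a bare account name or an already-qualified name.
    static String toFqdn(String account) {
        if (account == null || account.isEmpty()) {
            throw new IllegalArgumentException("account name required");
        }
        return account.endsWith(GEN1_SUFFIX) ? account : account + GEN1_SUFFIX;
    }

    public static void main(String[] args) {
        System.out.println(toFqdn("mydatalakestoragegen1"));
    }
}
```

The resulting string is what replaces **FILL-IN-HERE** in the snippet above.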

@@ -112,27 +99,25 @@ The following snippet creates a file (c.txt) in the directory structure and writ

 ```java
 // create file and write some content
-String filename = "c.txt";
-try (FileOutputStream stream = new FileOutputStream(filename);
-    PrintWriter out = new PrintWriter(stream)) {
-    for (int i = 1; i <= 10; i++) {
-        out.println("This is line #" + i);
-        out.format("This is the same line (%d), but using formatted output. %n", i);
-    }
+String filename = "/a/b/c.txt";
+OutputStream stream = client.createFile(filename, IfExists.OVERWRITE);
+PrintStream out = new PrintStream(stream);
+for (int i = 1; i <= 10; i++) {
+    out.println("This is line #" + i);
+    out.format("This is the same line (%d), but using formatted output. %n", i);
 }
-dataLakeFileSystemClient.createFile("a/b/" + filename, true);
+out.close();
 System.out.println("File created.");
 ```

 You can also create a file (d.txt) using byte arrays.

 ```java
 // create file using byte arrays
-DataLakeFileClient dataLakeFileClient = dataLakeFileSystemClient.createFile("a/b/d.txt", true);
+stream = client.createFile("/a/b/d.txt", IfExists.OVERWRITE);
 byte[] buf = getSampleContent();
-try (ByteArrayInputStream stream = new ByteArrayInputStream(buf)) {
-    dataLakeFileClient.upload(stream, buf.length);
-}
+stream.write(buf);
+stream.close();
 System.out.println("File created using byte array.");
 ```
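The write loop in the restored snippet is plain `java.io`; only the `client.createFile` call needs Azure. A local sketch (pointing the same `PrintStream` at an in-memory buffer, which is an assumption for testing, not what the SDK does) shows exactly what ends up in `/a/b/c.txt`:

```java
import java.io.ByteArrayOutputStream;
import java.io.PrintStream;

// Sketch: run the restored write loop against an in-memory stream to
// preview the file content (two lines per iteration, ten iterations).
public class WritePreview {
    static String preview() {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        PrintStream out = new PrintStream(buffer);
        for (int i = 1; i <= 10; i++) {
            out.println("This is line #" + i);
            out.format("This is the same line (%d), but using formatted output. %n", i);
        }
        out.close();
        return buffer.toString();
    }

    public static void main(String[] args) {
        System.out.print(preview());
    }
}
```

Swapping the buffer for the stream returned by `client.createFile` reproduces the snippet's behavior.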

@@ -144,12 +129,10 @@ The following snippet appends content to an existing file.

 ```java
 // append to file
-byte[] buf = getSampleContent();
-try (ByteArrayInputStream stream = new ByteArrayInputStream(buf)) {
-    DataLakeFileClient dataLakeFileClient = dataLakeDirectoryClient.getFileClient(filename);
-    dataLakeFileClient.append(stream, 0, buf.length);
-    System.out.println("File appended.");
-}
+stream = client.getAppendStream(filename);
+stream.write(getSampleContent());
+stream.close();
+System.out.println("File appended.");
 ```

 The definition for the `getSampleContent` function used in the preceding snippet is available as part of the sample [on GitHub](https://azure.microsoft.com/documentation/samples/data-lake-store-java-upload-download-get-started/).
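The real `getSampleContent` definition lives in the GitHub sample linked above; the append and write calls only need it to return a `byte[]`. Purely as an assumed stand-in (this is not the sample's actual definition), one plausible shape is:

```java
import java.nio.charset.StandardCharsets;

// Assumed stand-in for the sample's getSampleContent() helper; the real
// definition is in the GitHub sample. Any byte[] works with write/append.
public class SampleContent {
    static byte[] getSampleContent() {
        StringBuilder sb = new StringBuilder();
        for (int i = 1; i <= 3; i++) {
            sb.append("Sample line ").append(i).append('\n');
        }
        return sb.toString().getBytes(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        System.out.println(getSampleContent().length + " bytes of sample content");
    }
}
```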
@@ -160,10 +143,10 @@ The following snippet reads content from a file in a Data Lake Storage Gen1 acco

 ```java
 // Read File
-try (InputStream dataLakeIn = dataLakeFileSystemClient.getFileClient(filename).openInputStream().getInputStream();
-    BufferedReader reader = new BufferedReader(new InputStreamReader(dataLakeIn))) {
-    String line;
-    while ((line = reader.readLine()) != null) {
+InputStream in = client.getReadStream(filename);
+BufferedReader reader = new BufferedReader(new InputStreamReader(in));
+String line;
+while ((line = reader.readLine()) != null) {
     System.out.println(line);
 }
 reader.close();
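The restored read loop is standard `BufferedReader`/`readLine` iteration; only `client.getReadStream` touches Azure. A self-contained sketch (driven by an in-memory stream, an assumption made so it runs locally) of the same loop:

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

// Sketch: the same readLine loop as the snippet, fed by an in-memory
// stream instead of client.getReadStream(filename).
public class ReadPreview {
    static List<String> readAllLines(InputStream in) {
        List<String> lines = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(in, StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null) {
                lines.add(line);
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return lines;
    }

    public static void main(String[] args) {
        InputStream in = new ByteArrayInputStream("alpha\nbeta\n".getBytes(StandardCharsets.UTF_8));
        readAllLines(in).forEach(System.out::println);
    }
}
```

The try-with-resources form also closes the stream if `readLine` throws, which the snippet's bare `reader.close()` does not.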
@@ -173,20 +156,12 @@ System.out.println("File contents read.");

 ## Concatenate files

-The following snippet concatenates two files in a Data Lake Storage Gen2 account. If successful, the concatenated file replaces the two existing files.
+The following snippet concatenates two files in a Data Lake Storage Gen1 account. If successful, the concatenated file replaces the two existing files.

 ```java
 // concatenate the two files into one
-dataLakeFileClient = dataLakeDirectoryClient.createFile("/a/b/f.txt", true);
 List<String> fileList = Arrays.asList("/a/b/c.txt", "/a/b/d.txt");
-fileList.stream().forEach(filename -> {
-    File concatenateFile = new File(filename);
-    try (InputStream fileIn = new FileInputStream(concatenateFile)) {
-        dataLakeFileClient.append(fileIn, 0, concatenateFile.length());
-    } catch (IOException e) {
-        e.printStackTrace();
-    }
-});
+client.concatenateFiles("/a/b/f.txt", fileList);
 System.out.println("Two files concatenated into a new file.");
 ```
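`concatenateFiles` runs server-side; no bytes travel through the client. To make the resulting byte order concrete, here is a local sketch using `SequenceInputStream` (an illustration of the effect only, not how the service implements it):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.SequenceInputStream;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;

// Sketch: the concatenated file holds the bytes of the first input
// followed by the bytes of the second, as this in-memory chain shows.
public class ConcatPreview {
    static byte[] concat(byte[] first, byte[] second) {
        InputStream combined = new SequenceInputStream(
                new ByteArrayInputStream(first), new ByteArrayInputStream(second));
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] chunk = new byte[4096];
        int n;
        try {
            while ((n = combined.read(chunk)) != -1) {
                out.write(chunk, 0, n);
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) {
        byte[] merged = concat("contents of c.txt\n".getBytes(StandardCharsets.UTF_8),
                               "contents of d.txt\n".getBytes(StandardCharsets.UTF_8));
        System.out.print(new String(merged, StandardCharsets.UTF_8));
    }
}
```

Unlike this sketch, the service operation also deletes the two source files on success, as the snippet's lead-in notes.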

@@ -196,7 +171,7 @@ The following snippet renames a file in a Data Lake Storage Gen1 account.

 ```java
 // rename the file
-dataLakeFileSystemClient.getFileClient("a/b/f.txt").rename(dataLakeFileSystemClient.getFileSystemName(), "a/b/g.txt");
+client.rename("/a/b/f.txt", "/a/b/g.txt");
 System.out.println("New file renamed.");
 ```

@@ -206,8 +181,8 @@ The following snippet retrieves the metadata for a file in a Data Lake Storage G

 ```java
 // get file metadata
-Map<String, String> metaData = dataLakeFileSystemClient.getFileClient(filename).getProperties().getMetadata();
-printDirectoryInfo(metaData);
+DirectoryEntry ent = client.getDirectoryEntry(filename);
+printDirectoryInfo(ent);
 System.out.println("File metadata retrieved.");
 ```

@@ -217,8 +192,7 @@ The following snippet sets permissions on the file that you created in the previ

 ```java
 // set file permission
-PathAccessControl pathAccessControl = dataLakeFileSystemClient.getFileClient(filename).getAccessControl();
-dataLakeFileSystemClient.getFileClient(filename).setPermissions(PathPermissions.parseOctal("744"), pathAccessControl.getGroup(), pathAccessControl.getOwner());
+client.setPermission(filename, "744");
 System.out.println("File permission set.");
 ```
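The `"744"` argument is a three-digit octal permission string: each digit encodes read (4), write (2), and execute (1) bits for owner, group, and other respectively, so 744 grants the owner full access and everyone else read-only. A small decoding sketch (the class and method names are illustrative, not SDK API):

```java
// Sketch: decode the octal string that setPermission accepts into rwx
// form, to make "744" concrete. Names are illustrative only.
public class OctalPermissions {
    static String toRwx(String octal) {
        if (!octal.matches("[0-7]{3}")) {
            throw new IllegalArgumentException("expected three octal digits, got " + octal);
        }
        StringBuilder sb = new StringBuilder(9);
        for (char c : octal.toCharArray()) {
            int bits = c - '0';
            sb.append((bits & 4) != 0 ? 'r' : '-');
            sb.append((bits & 2) != 0 ? 'w' : '-');
            sb.append((bits & 1) != 0 ? 'x' : '-');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println("744 -> " + toRwx("744")); // rwxr--r--
    }
}
```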

@@ -228,9 +202,11 @@ The following snippet lists the contents of a directory, recursively.

 ```java
 // list directory contents
-dataLakeFileSystemClient.listPaths(new ListPathsOptions().setPath("a/b"), Duration.ofSeconds(2000)).forEach(path -> {
-    printDirectoryInfo(dataLakeFileSystemClient.getDirectoryClient(path.getName()).getProperties().getMetadata());
-});
+List<DirectoryEntry> list = client.enumerateDirectory("/a/b", 2000);
+System.out.println("Directory listing for directory /a/b:");
+for (DirectoryEntry entry : list) {
+    printDirectoryInfo(entry);
+}
 System.out.println("Directory contents listed.");
 ```

@@ -242,7 +218,7 @@ The following snippet deletes the specified files and folders in a Data Lake Sto

 ```java
 // delete directory along with all the subdirectories and files in it
-dataLakeFileSystemClient.deleteDirectory("a");
+client.deleteRecursive("/a");
 System.out.println("All files and folders deleted recursively");
 promptEnterKey();
 ```
@@ -252,7 +228,6 @@ promptEnterKey();
 2. To produce a standalone jar that you can run from the command line, build the jar with all dependencies included, using the [Maven assembly plugin](https://maven.apache.org/plugins/maven-assembly-plugin/usage.html). The pom.xml in the [example source code on GitHub](https://github.com/Azure-Samples/data-lake-store-java-upload-download-get-started/blob/master/pom.xml) has an example.

 ## Next steps
-* [Explore JavaDoc for the Java SDK](https://azure.github.io/azure-sdk-for-java/datalakestorage%28gen2%29.html)
-* [Secure data in Data Lake Storage Gen2](data-lake-store-secure-data.md)
-
+* [Explore JavaDoc for the Java SDK](https://azure.github.io/azure-data-lake-store-java/javadoc/)
+* [Secure data in Data Lake Storage Gen1](data-lake-store-secure-data.md)
