Commit 6a84f5f

Merge pull request #89422 from mhopkins-msft/list-blobs-js

Updated code snippet

2 parents: 95178ea + 0aa409b

1 file changed: 88 additions, 29 deletions

articles/storage/blobs/storage-quickstart-blobs-nodejs-v10.md

description: Create, upload, and delete blobs and containers in Node.js with Azu
author: mhopkins-msft

ms.author: mhopkins
ms.date: 09/24/2019
ms.service: storage
ms.subservice: blobs
ms.topic: quickstart
```bash
npm install
```

## Run the sample

Now that the dependencies are installed, you can run the sample by issuing the following command:

```bash
npm start
```
The output from the app will be similar to the following example:

```bash
Container "demo" is created
Containers:
 - container-one
 - container-two
 - demo
Blob "quickstart.txt" is uploaded
Local file "./readme.md" is uploaded
Blobs in "demo" container:
Blob "quickstart.txt" is deleted
Container "demo" is deleted
Done
```

If you're using a new storage account for this quickstart, then you may only see the *demo* container listed under the label "*Containers:*".

## Understanding the code

The sample begins by importing a number of classes and functions from the Azure Blob storage namespace. Each of the imported items is discussed in context as they're used in the sample.

```javascript
const STORAGE_ACCOUNT_NAME = process.env.AZURE_STORAGE_ACCOUNT_NAME;
const ACCOUNT_ACCESS_KEY = process.env.AZURE_STORAGE_ACCOUNT_ACCESS_KEY;
```

The next set of constants helps to reveal the intent of file size calculations during upload operations.

```javascript
const ONE_MEGABYTE = 1024 * 1024;
const FOUR_MEGABYTES = 4 * ONE_MEGABYTE;
```

Requests made by the API can be set to time out after a given interval. The [Aborter](/javascript/api/%40azure/storage-blob/aborter?view=azure-node-preview) class is responsible for managing how requests are timed out, and the following constant is used to define the timeouts used in this sample.

```javascript
const ONE_MINUTE = 60 * 1000;
```

### Calling code

To support JavaScript's *async/await* syntax, all the calling code is wrapped in a function named *execute*. Then *execute* is called and handled as a promise.
```javascript
async function execute() {
    // commands...
}

execute().then(() => console.log("Done")).catch((e) => console.log(e));
```

All of the following code runs inside the execute function where the `// commands...` comment is placed.

First, the relevant variables are declared to assign names, sample content, and to point to the local file to upload to Blob storage.
```javascript
const credentials = new SharedKeyCredential(STORAGE_ACCOUNT_NAME, ACCOUNT_ACCESS_KEY);
const pipeline = StorageURL.newPipeline(credentials);
const serviceURL = new ServiceURL(`https://${STORAGE_ACCOUNT_NAME}.blob.core.windows.net`, pipeline);
```

The following classes are used in this block of code:

- The [SharedKeyCredential](/javascript/api/%40azure/storage-blob/sharedkeycredential?view=azure-node-preview) class is responsible for wrapping storage account credentials to provide them to a request pipeline.
```javascript
const containerURL = ContainerURL.fromServiceURL(serviceURL, containerName);
const blockBlobURL = BlockBlobURL.fromContainerURL(containerURL, blobName);
```

The *containerURL* and *blockBlobURL* variables are reused throughout the sample to act on the storage account.

At this point, the container doesn't exist in the storage account. The instance of *ContainerURL* represents a URL that you can act upon. By using this instance, you can create and delete the container. The location of this container equates to a location such as this:
```bash
https://<ACCOUNT_NAME>.blob.core.windows.net/demo/quickstart.txt
```

As with the container, the block blob doesn't exist yet. The *blockBlobURL* variable is used later to create the blob by uploading content.

### Create a container

To create a container, the *ContainerURL*'s *create* method is used.

```javascript
await containerURL.create(aborter);
console.log(`Container: "${containerName}" is created`);
```

As the name of the container is defined when calling *ContainerURL.fromServiceURL(serviceURL, containerName)*, calling the *create* method is all that's required to create the container.

### Using the Aborter class

Requests made by the API can be set to time out after a given interval. The *Aborter* class is responsible for managing how requests are timed out. The following code creates a context where a set of requests is given 30 minutes to execute.

```javascript
const aborter = Aborter.timeout(30 * ONE_MINUTE);
```

Aborters give you control over requests by allowing you to:

- designate the amount of time given for a batch of requests
- use the *Aborter.none* static member to stop your requests from timing out altogether
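The *Aborter* itself comes from the SDK, but the timeout behavior it provides can be illustrated with plain promises. The following sketch is a conceptual stand-in, not the SDK's implementation: the `withTimeout` helper is hypothetical, and it races an operation against a timer in a way that is similar in spirit to `Aborter.timeout`.

```javascript
// Hypothetical helper (not part of the SDK): reject a slow operation
// after a time budget, similar in spirit to Aborter.timeout(ms).
function withTimeout(promise, ms) {
    let timer;
    const timeout = new Promise((_, reject) => {
        timer = setTimeout(() => reject(new Error("operation timed out")), ms);
    });
    // Whichever settles first wins; the timer is always cleaned up.
    return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

// Usage: a 50 ms "request" governed by a 10 ms budget.
const slow = new Promise(resolve => setTimeout(() => resolve("done"), 50));
withTimeout(slow, 10).catch(err => console.log(err.message)); // logs "operation timed out"
```

Unlike this sketch, a real *Aborter* also propagates cancellation to the underlying request rather than merely abandoning it.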
### Show container names

Accounts can store a vast number of containers. The following code demonstrates how to list containers in a segmented fashion, which allows you to cycle through a large number of containers. The *showContainerNames* function is passed instances of *Aborter* and *ServiceURL*.

```javascript
console.log("Containers:");
await showContainerNames(aborter, serviceURL);
```

The *showContainerNames* function uses the *listContainersSegment* method to request batches of container names from the storage account.

```javascript
async function showContainerNames(aborter, serviceURL) {
    let marker = undefined;

    do {
        const listContainersResponse = await serviceURL.listContainersSegment(aborter, marker);
        marker = listContainersResponse.nextMarker;
        for (const container of listContainersResponse.containerItems) {
            console.log(` - ${ container.name }`);
        }
    } while (marker);
}
```

When the response is returned, the *containerItems* are iterated to log each name to the console.
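The marker-based loop can be exercised without a storage account. In this sketch, `listContainersSegmentMock` is a made-up stand-in for *listContainersSegment* that serves two pages; an undefined *nextMarker* on the last page is what ends the `do...while` loop.

```javascript
// Mock stand-in for segmented listing (names are hypothetical): each call
// returns one "page" of items plus a nextMarker for the next call.
const pages = [
    { containerItems: [{ name: "container-one" }, { name: "container-two" }], nextMarker: "page-2" },
    { containerItems: [{ name: "demo" }], nextMarker: undefined }
];

function listContainersSegmentMock(marker) {
    return Promise.resolve(marker === "page-2" ? pages[1] : pages[0]);
}

// Same loop shape as showContainerNames: keep requesting segments
// until the service stops handing back a marker.
async function collectNames() {
    const names = [];
    let marker = undefined;
    do {
        const response = await listContainersSegmentMock(marker);
        marker = response.nextMarker;
        for (const container of response.containerItems) {
            names.push(container.name);
        }
    } while (marker);
    return names;
}

collectNames().then(names => console.log(names.join(", ")));
// prints "container-one, container-two, demo"
```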
### Upload text

To upload text to the blob, use the *upload* method.

```javascript
await blockBlobURL.upload(aborter, content, content.length);
console.log(`Blob "${blobName}" is uploaded`);
```

Here the text and its length are passed into the method.

### Upload a local file

To upload a local file to the container, you need a container URL and the path to the file.

```javascript
await uploadLocalFile(aborter, containerURL, localFilePath);
console.log(`Local file "${localFilePath}" is uploaded`);
```

The *uploadLocalFile* function calls the *uploadFileToBlockBlob* function, which takes the file path and an instance of the destination block blob as arguments.

```javascript
async function uploadLocalFile(aborter, containerURL, filePath) {

    return await uploadFileToBlockBlob(aborter, filePath, blockBlobURL);
}
```

### Upload a stream

Uploading streams is also supported. This sample opens a local file as a stream to pass to the upload method.

```javascript
await uploadStream(aborter, containerURL, localFilePath);
console.log(`Local file "${localFilePath}" is uploaded as a stream`);
```

The *uploadStream* function calls *uploadStreamToBlockBlob* to upload the stream to the storage container.

```javascript
async function uploadStream(aborter, containerURL, filePath) {
    filePath = path.resolve(filePath);

    const fileName = path.basename(filePath).replace('.md', '-stream.md');
        uploadOptions.maxBuffers);
}
```

During an upload, *uploadStreamToBlockBlob* allocates buffers to cache data from the stream in case a retry is necessary. The *maxBuffers* value designates at most how many buffers are used, as each buffer creates a separate upload request. Ideally, more buffers equate to higher speeds, but at the cost of higher memory usage. The upload speed plateaus when the number of buffers is high enough that the bottleneck transitions to the network or disk instead of the client.
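To make the memory tradeoff concrete, here is a rough back-of-the-envelope calculation. It assumes an *uploadOptions* object carrying *bufferSize* and *maxBuffers* fields (field names as referenced in the sample above; the specific values here are illustrative): peak buffer memory is roughly their product.

```javascript
// Constants from the sample.
const ONE_MEGABYTE = 1024 * 1024;
const FOUR_MEGABYTES = 4 * ONE_MEGABYTE;

// Illustrative values: 5 concurrent 4 MB buffers.
const uploadOptions = { bufferSize: FOUR_MEGABYTES, maxBuffers: 5 };

// Each buffer holds up to bufferSize bytes, so peak buffer memory is
// approximately bufferSize * maxBuffers.
const peakBufferBytes = uploadOptions.bufferSize * uploadOptions.maxBuffers;
console.log(`${peakBufferBytes / ONE_MEGABYTE} MB`); // prints "20 MB"
```

Doubling *maxBuffers* doubles both the number of in-flight requests and this memory ceiling, which is why throughput gains flatten out once the network or disk becomes the bottleneck.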
### Show blob names

Just as accounts can contain many containers, each container can potentially contain a vast number of blobs. Access to each blob in a container is available via an instance of the *ContainerURL* class.

```javascript
console.log(`Blobs in "${containerName}" container:`);
await showBlobNames(aborter, containerURL);
```

The function *showBlobNames* calls *listBlobFlatSegment* to request batches of blobs from the container.

```javascript
async function showBlobNames(aborter, containerURL) {
    let marker = undefined;

    do {
        const listBlobsResponse = await containerURL.listBlobFlatSegment(Aborter.none, marker);
        marker = listBlobsResponse.nextMarker;
        for (const blob of listBlobsResponse.segment.blobItems) {
            console.log(` - ${ blob.name }`);
        }
    } while (marker);
}
```

### Download a blob

Once a blob is created, you can download the contents by using the *download* method.

```javascript
const downloadResponse = await blockBlobURL.download(aborter, 0);
const downloadedContent = await streamToString(downloadResponse.readableStreamBody);
console.log(`Downloaded blob content: "${downloadedContent}"`);
```

The response is returned as a stream. In this example, the stream is converted to a string by using the following *streamToString* helper function.

```javascript
// A helper method used to read a Node.js readable stream into a string
async function streamToString(readableStream) {
    return new Promise((resolve, reject) => {
        const chunks = [];
        readableStream.on("data", data => {
            chunks.push(data.toString());
        });
        readableStream.on("end", () => {
            resolve(chunks.join(""));
        });
        readableStream.on("error", reject);
    });
}
```
### Delete a blob

The *delete* method from a *BlockBlobURL* instance deletes a blob from the container.

```javascript
await blockBlobURL.delete(aborter);
console.log(`Block blob "${blobName}" is deleted`);
```

### Delete a container

The *delete* method from a *ContainerURL* instance deletes a container from the storage account.

```javascript
await containerURL.delete(aborter);
console.log(`Container "${containerName}" is deleted`);
```

## Clean up resources

All data written to the storage account is automatically deleted at the end of the code sample.

## Next steps
