    - You can specify which containers and blobs to export.

        - **To specify a blob to export**: Use the **Equal To** selector. Specify the relative path to the blob, beginning with the container name. Use *$root* to specify the root container.

        - **To specify all blobs starting with a prefix**: Use the **Starts With** selector. Specify the prefix, beginning with a forward slash '/'. The prefix may be the prefix of the container name, the complete container name, or the complete container name followed by the prefix of the blob name. Provide the blob paths in a valid format to avoid errors during processing. For more information, see [Examples of valid blob paths](#examples-of-valid-blob-paths).

    - You can export from the blob list file.

    > [!NOTE]
    > If the blob to be exported is in use during data copy, Azure Import/Export service takes a snapshot of the blob and copies the snapshot.

6. In **Return shipping info**:

    - Select the carrier from the dropdown list. If you want to use a carrier other than FedEx/DHL, choose an existing option from the dropdown, and contact the Azure Data Box Operations team at `[email protected]` with information about the carrier you plan to use.

    - Enter a valid carrier account number that you have created with that carrier. Microsoft uses this account to ship the drives back to you once your export job is complete.

    - Provide a complete and valid contact name, phone, email, street address, city, zip, state/province, and country/region.

    > [!TIP]
    > Instead of specifying an email address for a single user, provide a group email. This ensures that you receive notifications even if an admin leaves.

7. In **Summary**:

    - Review the details of the job.

    - Make a note of the job name and the Azure datacenter shipping address provided for shipping disks to Azure.

    > [!NOTE]
    > Always send the disks to the datacenter noted in the Azure portal. If the disks are shipped to the wrong datacenter, the job will not be processed.

    - Click **OK** to complete export job creation.
-->
### [Azure CLI](#tab/azure-cli)
Use the following steps to create an export job in the Azure CLI.
If you do not know the number of drives you need, see [Determine how many drives you need](storage-import-export-determine-drives-for-export.md#determine-how-many-drives-you-need). If you know the number of drives, proceed to ship the drives.
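The blobs to export can be listed in an XML blob list file made up of `<BlobPath>` and `<BlobPathPrefix>` elements. If you maintain the set of blobs to export in code, that file can be generated programmatically; a minimal sketch (the function name and the example paths are illustrative):

```python
import xml.etree.ElementTree as ET

def build_blob_list(blob_paths, blob_prefixes):
    """Build an export blob list XML document.

    blob_paths    -- exact blob paths (container/blob), emitted as <BlobPath>
    blob_prefixes -- path prefixes starting with '/', emitted as <BlobPathPrefix>
    """
    root = ET.Element("BlobList")
    for path in blob_paths:
        ET.SubElement(root, "BlobPath").text = path
    for prefix in blob_prefixes:
        ET.SubElement(root, "BlobPathPrefix").text = prefix
    return ET.tostring(root, encoding="unicode")

# One exact blob, plus every blob under two prefixes.
xml_doc = build_blob_list(
    ["pictures/animals/koala.jpg"],
    ["/vhds/", "/movies/"],
)
print(xml_doc)
```

Write the resulting string to a file and pass its path wherever a blob list file is expected.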
At this time, you can delete the job or leave it. Jobs automatically get deleted after 90 days.
<!--## Check the number of drives

This *optional* step helps you determine the number of drives required for the export job. Perform this step on a Windows system running a [Supported OS version](storage-import-export-requirements.md#supported-operating-systems).

1. [Download the WAImportExport version 1](https://www.microsoft.com/download/details.aspx?id=42659) on the Windows system.

2. Unzip to the default folder `waimportexportv1`. For example, `C:\WaImportExportV1`.

3. Open a PowerShell or command-line window with administrative privileges. To change directory to the unzipped folder, run the following command:

   `cd C:\WaImportExportV1`

4. To check the number of disks required for the selected blobs, run the following command:

   `WAImportExport.exe PreviewExport /ExportBlobListFile:<Path to XML blob list file> /DriveSize:<Size of drives used>`

   The parameters are described in the following table:

   |Command-line parameter|Description|
   |--------------------------|-----------------|
   |**/logdir:**|Optional. The log directory. Verbose log files are written to this directory. If not specified, the current directory is used as the log directory.|
   |**/ExportBlobListFile:**|Required. Path to the XML file containing the list of blob paths or blob path prefixes for the blobs to be exported. The file format is the same as the `BlobListBlobPath` element format used in the [Put Job](/rest/api/storageimportexport/jobs) operation of the Import/Export service REST API.|
   |**/DriveSize:**|Required. The size of drives to use for an export job, *for example*, 500 GB or 1.5 TB.|

   See an [Example of the PreviewExport command](#example-of-previewexport-command).

5. Check that you can read/write to the drives that will be shipped for the export job.

### Example of PreviewExport command

The following example demonstrates the `PreviewExport` command.

The export blob list file may contain blob names and blob prefixes, as shown here:

```xml
<?xml version="1.0" encoding="utf-8"?>
<BlobList>
<BlobPath>pictures/animals/koala.jpg</BlobPath>
<BlobPathPrefix>/vhds/</BlobPathPrefix>
<BlobPathPrefix>/movies/</BlobPathPrefix>
</BlobList>
```

The Azure Import/Export Tool lists all blobs to be exported and calculates how to pack them into drives of the specified size, taking into account any necessary overhead. It then estimates the number of drives needed to hold the blobs and provides drive usage information.

Here is an example of the output, with informational logs omitted:

```powershell
Number of unique blob paths/prefixes: 3
Number of duplicate blob paths/prefixes: 0
Number of nonexistent blob paths/prefixes: 1

Drive size: 500.00 GB
Number of blobs that can be exported: 6
Number of blobs that cannot be exported: 2
Number of drives needed: 3
Drive #1: blobs = 1, occupied space = 454.74 GB
Drive #2: blobs = 3, occupied space = 441.37 GB
Drive #3: blobs = 2, occupied space = 131.28 GB
```

## Examples of valid blob paths

The following table shows examples of valid blob paths:

| Selector | Blob Path | Description |
| --- | --- | --- |
| Starts With |/ |Exports all blobs in the storage account |
| Starts With |/$root/ |Exports all blobs in the root container |
| Starts With |/book |Exports all blobs in any container that begins with prefix **book** |
| Starts With |/music/ |Exports all blobs in container **music** |
| Starts With |/music/love |Exports all blobs in container **music** that begin with prefix **love** |
| Equal To |$root/logo.bmp |Exports blob **logo.bmp** in the root container |
| Equal To |videos/story.mp4 |Exports blob **story.mp4** in container **videos** |
-->
## Next steps
* [View the job and drive status](storage-import-export-view-drive-status.md)
## Prerequisites

* Have an active Azure subscription that can be used for the Import/Export service.
* Have at least one Azure Storage account with a storage container. See the list of [Supported storage accounts and storage types for Import/Export service](storage-import-export-requirements.md).
    * For information on creating a new storage account, see [How to Create a Storage Account](../storage/common/storage-account-create.md).
    * For information on creating storage containers, go to [Create a storage container](../storage/blobs/storage-quickstart-blobs-portal.md#create-a-container).
* Have an adequate number of disks of [supported types](storage-import-export-requirements.md#supported-disks).
* Have a Windows system running a [supported OS version](storage-import-export-requirements.md#supported-operating-systems).
* Enable BitLocker on the Windows system. See [How to enable BitLocker](https://thesolving.com/storage/how-to-enable-bitlocker-on-windows-server-2012-r2/).
* Download the current release of the Azure Import/Export version 1 tool, for blobs, on the Windows system:
    1. [Download WAImportExport version 1](https://www.microsoft.com/download/details.aspx?id=42659). The current version is 1.5.0.300.
    1. Unzip to the default folder `WaImportExportV1`. For example, `C:\WaImportExportV1`.
* Have a FedEx/DHL account. If you want to use a carrier other than FedEx/DHL, contact the Azure Data Box Operations team at `[email protected]`.
    * The account must be valid, must have a balance, and must have return shipping capabilities.
    * Generate a tracking number for the export job.
|/skipwrite: | Specifies that no new data needs to be copied and that the existing data on the disk is to be prepared. |
|/enablecontentmd5: |When enabled, this option ensures that MD5 is computed and set as the `Content-md5` property on each blob. Use this option only if you want to use the `Content-md5` field after the data is uploaded to Azure. <br> This option does not affect the data integrity check (which occurs by default). The setting does increase the time taken to upload data to the cloud. |
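As context for `/enablecontentmd5`: Azure Blob storage records the `Content-md5` property as the Base64-encoded MD5 digest of the blob content. A minimal sketch of how such a value is derived (the function name is illustrative):

```python
import base64
import hashlib

def content_md5(data: bytes) -> str:
    """Return the Base64-encoded MD5 digest of the given bytes,
    the form used for a blob's Content-md5 property."""
    return base64.b64encode(hashlib.md5(data).digest()).decode("ascii")

print(content_md5(b"hello"))  # → XUFAKrxLKna5cZ2REBfFkg==
```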
> [!NOTE]
> If you import a blob with the same name as an existing blob in the destination container, the imported blob overwrites the existing blob. In earlier tool versions (before 1.5.0.300), the imported blob was renamed by default, and a \Disposition parameter let you specify whether to rename, overwrite, or disregard the blob in the import.
8. Repeat the previous step for each disk that needs to be shipped.
A journal file with the provided name is created for every run of the command line.
## Step 6: Verify data upload to Azure
Track the job to completion. Once the job is complete, verify that your data has uploaded to Azure. Delete the on-premises data only after you have verified that the upload was successful. For more information, see [Review Import/Export copy logs](storage-import-export-tool-reviewing-job-status-v1.md).