Perform the following steps to order an import job in Azure Import/Export. The Azure Import/Export service creates a job of the type "Data Box."
1. Use your Microsoft Azure credentials to sign in at this URL: [https://portal.azure.com](https://portal.azure.com).
1. Select **+ Create a resource** and search for *Azure Data Box*. Select **Azure Data Box**.
<Blob selections include a container, a blob, and blob prefixes that work like wildcards. The Add Prefixes pane on the right is used to add prefixes that select blobs based on common text in the blob path or name.>
:::image-end:::
- Choose **Export from blob list file (XML format)**, and select an XML file that contains a list of paths and prefixes for the blobs to be exported from the storage account. You must construct the XML file and store it in a container for the storage account. The file can't be empty.
> [!IMPORTANT]
> If you use an XML file to select the blobs to export, make sure that the XML contains valid paths and/or prefixes. If the file is invalid or no data matches the paths specified, the order terminates with partial data or no data exported.
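For illustration, a minimal blob list file might look like the following sketch. The element names follow the blob list format the service expects, while the container name, blob path, and prefixes shown here are hypothetical.

```
<?xml version="1.0" encoding="utf-8"?>
<BlobList>
  <!-- Export a single blob, addressed by container and blob name -->
  <BlobPath>/pictures/animals/koala.jpg</BlobPath>
  <!-- Export every blob whose path starts with one of these prefixes -->
  <BlobPathPrefix>/vhds/</BlobPathPrefix>
  <BlobPathPrefix>/movies/</BlobPathPrefix>
</BlobList>
```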
1. In **Return shipping**:
1. Select a shipping carrier from the drop-down list for **Carrier**. The location of the Microsoft datacenter for the selected region determines which carriers are available.
1. Enter a **Carrier account number**. The account number for a valid carrier account is required.
1. In the **Return address** area, use **+ Add Address** to add the address to ship to.

If you don't know the number of drives you need, see [Determine how many drives you need](storage-import-export-determine-drives-for-export.md#determine-how-many-drives-you-need). If you know the number of drives, proceed to ship the drives.
When the dashboard reports the job is complete, the disks are shipped to you and the tracking number for the shipment is available in the portal.
1. After you receive the drives with exported data, you need to get the BitLocker keys to unlock the drives. Go to the export job in the Azure portal. Select the **Import/Export** tab.
2. Select your export job from the list. Go to **Encryption** and copy the keys.

Use the following command to unlock the drive:
`WAImportExport Unlock /bk:<BitLocker key (base 64 string) copied from Encryption blade in Azure portal> /driveLetter:<Drive letter>`
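For example, with a hypothetical drive letter of E: and the key copied from the portal, the calls might look like the following. The second command is simply the standard `manage-bde` status check to confirm the volume is unlocked; it isn't part of the WAImportExport tool.

```
WAImportExport Unlock /bk:<BitLocker key copied from the Encryption blade> /driveLetter:E
manage-bde -status E:
```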
`articles/import-export/storage-import-export-data-to-blobs.md`
Perform the following steps to prepare the drives.
1. Connect your disk drives to the Windows system via SATA connectors.
2. Create a single NTFS volume on each drive. Assign a drive letter to the volume. Don't use mountpoints.
3. Enable BitLocker encryption on the NTFS volume. If using a Windows Server system, use the instructions in [How to enable BitLocker on Windows Server 2012 R2](https://thesolving.com/storage/how-to-enable-bitlocker-on-windows-server-2012-r2/).
4. Copy data to the encrypted volume. Use drag and drop, Robocopy (a sketch follows these steps), or any such copy tool. A journal (*.jrn*) file is created in the same folder where you run the tool.
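For example, Robocopy could be used along these lines to copy a local folder to the encrypted volume. This is a sketch only: the source folder, the E: drive letter, and the log path are hypothetical, and the switches shown (`/E` to include subfolders, `/R` and `/W` to limit retries and wait time, `/LOG` to capture a log) are standard Robocopy options.

```
robocopy C:\DataToImport E:\DataToImport /E /R:2 /W:5 /LOG:C:\Temp\import-copy.log
```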
If the drive is locked and you need to unlock it, the steps to unlock may differ depending on your use case.
* If you have added data to a pre-encrypted drive (the WAImportExport tool wasn't used for encryption), use the BitLocker key (a numerical password that you specify) in the popup to unlock the drive.
* If you have added data to a drive that was encrypted by the WAImportExport tool, use the following command to unlock the drive:
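   A sketch of the command, following the same form as the unlock command shown earlier (the key is the BitLocker password for the drive, and the drive letter is the one you assigned):

   `WAImportExport Unlock /bk:<BitLocker key> /driveLetter:<Drive letter>`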
|Command parameter |Description |
|---------|---------|
|/bk: |The BitLocker key for the drive. It's the numerical password from the output of `manage-bde -protectors -get D:`. |
|/srcdir: |The drive letter of the disk to be shipped followed by `:\`. For example, `D:\`. |
|/dstdir: |The name of the destination container in Azure Storage. |
|/blobtype: |This option specifies the type of blobs you want to import the data to. For block blobs, the blob type is `BlockBlob` and for page blobs, it's `PageBlob`. |
|/skipwrite: |Specifies that there's no new data to copy and that the existing data on the disk is to be prepared. |
|/enablecontentmd5: |When enabled, this option ensures that MD5 is computed and set as the `Content-md5` property on each blob. Use this option only if you want to use the `Content-md5` field after the data is uploaded to Azure. <br> This option doesn't affect the data integrity check (which occurs by default). The setting does increase the time taken to upload data to the cloud. |
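As an illustration, a drive-preparation command that uses these parameters might look like the following. This is a sketch rather than the exact syntax for your tool version: the journal file name, session ID, and container name are placeholders, and the `/j:`, `/id:`, `/sk:`, and `/t:` parameters are assumed here in addition to the ones described in the table above. Because the data was already copied to the drive in the earlier steps, `/skipwrite` is included so that only preparation is performed.

```
WAImportExport.exe PrepImport /j:ImportJob1.jrn /id:session#1 /sk:<Azure Storage account key> /t:D /bk:<BitLocker key for the drive> /srcdir:D:\ /dstdir:importcontainer/ /blobtype:BlockBlob /skipwrite /enablecontentmd5
```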
> [!NOTE]
> - If you import a blob with the same name as an existing blob in the destination container, the imported blob will overwrite the existing blob. In earlier tool versions (before 1.5.0.300), the imported blob was renamed by default, and a /Disposition parameter let you specify whether to rename, overwrite, or disregard the blob in the import.
A journal file with the provided name is created for every run of the command line.
Together with the journal file, a `<Journal file name>_DriveInfo_<Drive serial ID>.xml` file is also created in the same folder where the tool resides. The .xml file is used in place of the journal file when creating a job if the journal file is too large.
> [!IMPORTANT]
> * Do not modify the journal files or the data on the disk drives, and don't reformat any disks, after completing disk preparation.
# Tutorial: Transfer data to Azure Files with Azure Import/Export
This article provides step-by-step instructions on how to use the Azure Import/Export service to securely import large amounts of data into Azure Files. To import data, the service requires you to ship supported disk drives containing your data to an Azure datacenter.
The Import/Export service supports only import of Azure Files into Azure Storage. Exporting Azure Files isn't supported.
In this tutorial, you learn how to:
Do the following steps to prepare the drives.
2. Create a single NTFS volume on each drive. Assign a drive letter to the volume. Do not use mountpoints.
3. Modify the *dataset.csv* file in the root folder where the tool is. Depending on whether you want to import a file or folder or both, add entries in the *dataset.csv* file similar to the following examples.
- **To import a file**: In the following example, the data to copy is on the F: drive. Your file *MyFile1.txt* is copied to the root of *MyAzureFileshare1*. If *MyAzureFileshare1* doesn't exist, it's created in the Azure Storage account. Folder structure is maintained.
```
BasePath,DstItemPathOrPrefix,ItemType
```
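For illustration, a complete *dataset.csv* for the single-file example above might look like the following sketch; the source folder on the F: drive is hypothetical.

```
BasePath,DstItemPathOrPrefix,ItemType
"F:\MyFolder1\MyFile1.txt","MyAzureFileshare1/MyFile1.txt",file
```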
> [!NOTE]
> The /Disposition parameter, which in earlier versions of the tool let you choose what to do when you import a file that already exists, isn't supported in Azure Import/Export version 2.2.0.300. In earlier tool versions, an imported file with the same name as an existing file was renamed by default.
Multiple entries can be made in the same file corresponding to folders or files that are imported.
This example assumes that two disks are attached and that basic NTFS volumes G:\ and H:\ are created. H:\ isn't encrypted, while G:\ is already encrypted. The tool formats and encrypts the disk that hosts H:\ only (and not G:\).
- **For a disk that isn't encrypted**: Specify *Encrypt* to enable BitLocker encryption on the disk.
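For illustration, the corresponding entries in the drive-set CSV file for this two-disk example might look like the following sketch. The column names follow the tool's drive-set format as an assumption, and the existing BitLocker key shown for G:\ is a placeholder.

```
DriveLetter,FormatOption,SilentOrPromptOnFormat,Encryption,ExistingBitLockerKey
G,AlreadyFormatted,SilentMode,AlreadyEncrypted,060456-014509-132033-080300-252615-584177
H,Format,SilentMode,Encrypt,
```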
`includes/storage-import-export-update-job-tracking.md`
ms.service: storage
ms.topic: include
ms.date: 02/13/2023
ms.author: alkohli
ms.custom: include file
---
> [!IMPORTANT]
> If the tracking number is not updated within 2 weeks of creating the job, the job expires.
### [Portal](#tab/azure-portal-preview)
To complete the tracking information for a job that you created in the portal, do these steps:
1. Open the job in the [Azure portal](https://portal.azure.com/).
1. On the **Overview** pane, scroll down to **Tracking information** and complete the entries:
You can track the job progress on the **Overview** pane. For a description of each job state, go to [View your job status](../articles/import-export/storage-import-export-view-drive-status.md).

### [Azure CLI](#tab/azure-cli)
If you created your Azure Import/Export job using Azure CLI, open the job in the Azure portal to update tracking information. Azure CLI and Azure PowerShell create jobs in the classic Azure Import/Export service and hence create an Azure resource of the type "Import/Export job."