
Commit 881bc69

feedback changes
1 parent 3022fa9 commit 881bc69

4 files changed (+22, -38 lines)

articles/import-export/storage-import-export-data-from-blobs.md

Lines changed: 8 additions & 8 deletions
@@ -5,7 +5,7 @@ author: alkohli
services: storage
ms.service: azure-import-export
ms.topic: tutorial
-ms.date: 03/14/2022
+ms.date: 02/13/2023
ms.author: alkohli
ms.custom: "tutorial, devx-track-azurepowershell, devx-track-azurecli, contperf-fy21q3"
---
@@ -42,7 +42,7 @@ You must:

# [Portal](#tab/azure-portal-preview)

-Perform the following steps to order an import job in Azure Import/Export. The Azure Import/Export service will create a job of the type "Data Box."
+Perform the following steps to order an import job in Azure Import/Export. The Azure Import/Export service creates a job of the type "Data Box."

1. Use your Microsoft Azure credentials to sign in at this URL: [https://portal.azure.com](https://portal.azure.com).
1. Select **+ Create a resource** and search for *Azure Data Box*. Select **Azure Data Box**.
@@ -104,7 +104,7 @@ Perform the following steps to order an import job in Azure Import/Export. The A
<Blob selections include a container, a blob, and blob prefixes that work like wildcards. The Add Prefixes pane on the right is used to add prefixes that select blobs based on common text in the blob path or name.>
:::image-end:::

-- Choose **Export from blob list file (XML format)**, and select an XML file that contains a list of paths and prefixes for the blobs to be exported from the storage account. You must construct the XML file and store it in a container for the storage account. The file cannot be empty.
+- Choose **Export from blob list file (XML format)**, and select an XML file that contains a list of paths and prefixes for the blobs to be exported from the storage account. You must construct the XML file and store it in a container for the storage account. The file can't be empty.

> [!IMPORTANT]
> If you use an XML file to select the blobs to export, make sure that the XML contains valid paths and/or prefixes. If the file is invalid or no data matches the paths specified, the order terminates with partial data or no data exported.
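
For orientation, a blob list file of the kind described above typically uses `<BlobPath>` entries for individual blobs and `<BlobPathPrefix>` entries that act as prefixes. The paths below are hypothetical placeholders, so treat this as a sketch rather than the article's own sample:

```xml
<?xml version="1.0" encoding="utf-8"?>
<BlobList>
  <!-- Hypothetical path: exports one specific blob (container/blob name). -->
  <BlobPath>pictures/animals/koala.jpg</BlobPath>
  <!-- Hypothetical prefixes: export every blob whose path starts with these. -->
  <BlobPathPrefix>/vhds/</BlobPathPrefix>
  <BlobPathPrefix>/movies/</BlobPathPrefix>
</BlobList>
```

The file must be stored in a container in the same storage account before the export job is created, as the bullet above notes.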
@@ -121,7 +121,7 @@ Perform the following steps to order an import job in Azure Import/Export. The A
1. In **Return shipping**:

1. Select a shipping carrier from the drop-down list for **Carrier**. The location of the Microsoft datacenter for the selected region determines which carriers are available.
-1. Enter a **Carrier account number**. The account number for an valid carrier account is required.
+1. Enter a **Carrier account number**. The account number for a valid carrier account is required.
1. In the **Return address** area, use **+ Add Address** to add the address to ship to.

![Screenshot of the Return Shipping tab for an import job in Azure Data Box. The Return Shipping tab and the Plus Add Address button are highlighted.](./media/storage-import-export-data-from-blobs/import-export-order-preview-07-export-job.png)
@@ -324,7 +324,7 @@ Install-Module -Name Az.ImportExport

## Step 2: Ship the drives

-If you do not know the number of drives you need, see [Determine how many drives you need](storage-import-export-determine-drives-for-export.md#determine-how-many-drives-you-need). If you know the number of drives, proceed to ship the drives.
+If you don't know the number of drives you need, see [Determine how many drives you need](storage-import-export-determine-drives-for-export.md#determine-how-many-drives-you-need). If you know the number of drives, proceed to ship the drives.

[!INCLUDE [storage-import-export-ship-drives](../../includes/storage-import-export-ship-drives.md)]

@@ -336,8 +336,8 @@ If you do not know the number of drives you need, see [Determine how many drives

When the dashboard reports the job is complete, the disks are shipped to you and the tracking number for the shipment is available in the portal.

-1. After you receive the drives with exported data, you need to get the BitLocker keys to unlock the drives. Go to the export job in the Azure portal. Click **Import/Export** tab.
-2. Select and click your export job from the list. Go to **Encryption** and copy the keys.
+1. After you receive the drives with exported data, you need to get the BitLocker keys to unlock the drives. Go to the export job in the Azure portal. Select **Import/Export** tab.
+2. Select your export job from the list. Go to **Encryption** and copy the keys.

![Screenshot of the Encryption blade for an export job in Azure Import Export Jobs. The Encryption menu item and Copy button for the key are highlighted.](./media/storage-import-export-data-from-blobs/export-from-blob-7.png)

@@ -351,7 +351,7 @@ Use the following command to unlock the drive:

`WAImportExport Unlock /bk:<BitLocker key (base 64 string) copied from Encryption blade in Azure portal> /driveLetter:<Drive letter>`

-Here is an example of the sample input.
+Here's an example of the sample input.

`WAImportExport.exe Unlock /bk:CAAcwBoAG8AdQBsAGQAIABiAGUAIABoAGkAZABkAGUAbgA= /driveLetter:e`

articles/import-export/storage-import-export-data-to-blobs.md

Lines changed: 6 additions & 6 deletions
@@ -49,13 +49,13 @@ This step generates a journal file. The journal file stores basic information su
Perform the following steps to prepare the drives.

1. Connect your disk drives to the Windows system via SATA connectors.
-2. Create a single NTFS volume on each drive. Assign a drive letter to the volume. Do not use mountpoints.
+2. Create a single NTFS volume on each drive. Assign a drive letter to the volume. Don't use mountpoints.
3. Enable BitLocker encryption on the NTFS volume. If using a Windows Server system, use the instructions in [How to enable BitLocker on Windows Server 2012 R2](https://thesolving.com/storage/how-to-enable-bitlocker-on-windows-server-2012-r2/).
4. Copy data to encrypted volume. Use drag and drop or Robocopy or any such copy tool. A journal (*.jrn*) file is created in the same folder where you run the tool.

If the drive is locked and you need to unlock the drive, the steps to unlock may be different depending on your use case.

-* If you have added data to a pre-encrypted drive (WAImportExport tool was not used for encryption), use the BitLocker key (a numerical password that you specify) in the popup to unlock the drive.
+* If you have added data to a pre-encrypted drive (WAImportExport tool wasn't used for encryption), use the BitLocker key (a numerical password that you specify) in the popup to unlock the drive.

* If you have added data to a drive that was encrypted by WAImportExport tool, use the following command to unlock the drive:

@@ -85,9 +85,9 @@ Perform the following steps to prepare the drives.
|/bk: |The BitLocker key for the drive. Its numerical password from output of `manage-bde -protectors -get D:` |
|/srcdir: |The drive letter of the disk to be shipped followed by `:\`. For example, `D:\`. |
|/dstdir: |The name of the destination container in Azure Storage. |
-|/blobtype: |This option specifies the type of blobs you want to import the data to. For block blobs, the blob type is `BlockBlob` and for page blobs, it is `PageBlob`. |
-|/skipwrite: | Specifies that there is no new data required to be copied and existing data on the disk is to be prepared. |
-|/enablecontentmd5: |The option when enabled, ensures that MD5 is computed and set as `Content-md5` property on each blob. Use this option only if you want to use the `Content-md5` field after the data is uploaded to Azure. <br> This option does not affect the data integrity check (that occurs by default). The setting does increase the time taken to upload data to cloud. |
+|/blobtype: |This option specifies the type of blobs you want to import the data to. For block blobs, the blob type is `BlockBlob` and for page blobs, it's `PageBlob`. |
+|/skipwrite: | Specifies that there's no new data required to be copied and existing data on the disk is to be prepared. |
+|/enablecontentmd5: |The option when enabled, ensures that MD5 is computed and set as `Content-md5` property on each blob. Use this option only if you want to use the `Content-md5` field after the data is uploaded to Azure. <br> This option doesn't affect the data integrity check (that occurs by default). The setting does increase the time taken to upload data to cloud. |

> [!NOTE]
> - If you import a blob with the same name as an existing blob in the destination container, the imported blob will overwrite the existing blob. In earlier tool versions (before 1.5.0.300), the imported blob was renamed by default, and a \Disposition parameter let you specify whether to rename, overwrite, or disregard the blob in the import.
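
To put those flags in context, a single `WAImportExport.exe PrepImport` run using the parameters from the table might look roughly like the sketch below. The journal name, session ID, drive letter, and container name are placeholders, and the exact flag set should be verified against the tool's own help output rather than read as the article's verbatim sample:

```
REM Placeholder values throughout; /bk takes the numerical password reported by "manage-bde -protectors -get D:"
WAImportExport.exe PrepImport /j:MyJournal.jrn /id:session#1 /t:D /bk:<BitLocker numerical password> /srcdir:D:\ /dstdir:newcontainer/ /blobtype:BlockBlob /skipwrite:false /enablecontentmd5:false
```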
@@ -97,7 +97,7 @@ Perform the following steps to prepare the drives.

A journal file with the provided name is created for every run of the command line.

-Together with the journal file, a `<Journal file name>_DriveInfo_<Drive serial ID>.xml` file is also created in the same folder where the tool resides. The .xml file is used in place of the journal file when creating a job if the journal file is too big.
+Together with the journal file, a `<Journal file name>_DriveInfo_<Drive serial ID>.xml` file is also created in the same folder where the tool resides. The .xml file is used in place of the journal file when creating a job if the journal file is too large.

> [!IMPORTANT]
> * Do not modify the journal files or the data on the disk drives, and don't reformat any disks, after completing disk preparation.

articles/import-export/storage-import-export-data-to-files.md

Lines changed: 5 additions & 5 deletions
@@ -5,15 +5,15 @@ author: alkohli
services: storage
ms.service: azure-import-export
ms.topic: tutorial
-ms.date: 02/01/2023
+ms.date: 02/13/2023
ms.author: alkohli
ms.custom: "tutorial, devx-track-azurepowershell, devx-track-azurecli, contperf-fy21q3"
---
# Tutorial: Transfer data to Azure Files with Azure Import/Export

This article provides step-by-step instructions on how to use the Azure Import/Export service to securely import large amounts of data into Azure Files. To import data, the service requires you to ship supported disk drives containing your data to an Azure datacenter.

-The Import/Export service supports only import of Azure Files into Azure Storage. Exporting Azure Files is not supported.
+The Import/Export service supports only import of Azure Files into Azure Storage. Exporting Azure Files isn't supported.

In this tutorial, you learn how to:

@@ -51,7 +51,7 @@ Do the following steps to prepare the drives.
2. Create a single NTFS volume on each drive. Assign a drive letter to the volume. Do not use mountpoints.
3. Modify the *dataset.csv* file in the root folder where the tool is. Depending on whether you want to import a file or folder or both, add entries in the *dataset.csv* file similar to the following examples.

-- **To import a file**: In the following example, the data to copy is on the F: drive. Your file *MyFile1.txt* is copied to the root of the *MyAzureFileshare1*. If the *MyAzureFileshare1* does not exist, it's created in the Azure Storage account. Folder structure is maintained.
+- **To import a file**: In the following example, the data to copy is on the F: drive. Your file *MyFile1.txt* is copied to the root of the *MyAzureFileshare1*. If the *MyAzureFileshare1* doesn't exist, it's created in the Azure Storage account. Folder structure is maintained.

```
BasePath,DstItemPathOrPrefix,ItemType
@@ -65,7 +65,7 @@ Do the following steps to prepare the drives.
```

> [!NOTE]
-> The /Disposition parameter, which let you choose what to do when you import a file that already exists in earlier versions of the tool, is not supported in Azure Import/Export version 2.2.0.300. In the earlier tool versions, an imported file with the same name as an existing file was renamed by default.
+> The /Disposition parameter, which let you choose what to do when you import a file that already exists in earlier versions of the tool, isn't supported in Azure Import/Export version 2.2.0.300. In the earlier tool versions, an imported file with the same name as an existing file was renamed by default.

Multiple entries can be made in the same file corresponding to folders or files that are imported.

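For reference, complete *dataset.csv* rows of the kind the truncated example above introduces could look like the following sketch; the share name and paths are placeholders built from the surrounding description, not the article's verbatim sample:

```
BasePath,DstItemPathOrPrefix,ItemType
"F:\MyFolder1\MyFile1.txt","MyAzureFileshare1/MyFile1.txt",file
"F:\MyFolder2\","MyAzureFileshare1/MyFolder2/",folder
```

The first row imports a single file to the root of the share; the second imports a whole folder under the same share, preserving the folder structure.
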
@@ -78,7 +78,7 @@ Do the following steps to prepare the drives.

This example assumes that two disks are attached and basic NTFS volumes G:\ and H:\ are created. H:\is not encrypted while G: is already encrypted. The tool formats and encrypts the disk that hosts H:\ only (and not G:\).

-- **For a disk that is not encrypted**: Specify *Encrypt* to enable BitLocker encryption on the disk.
+- **For a disk that isn't encrypted**: Specify *Encrypt* to enable BitLocker encryption on the disk.

```
DriveLetter,FormatOption,SilentOrPromptOnFormat,Encryption,ExistingBitLockerKey
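
The hunk is cut off before the CSV rows; for the scenario the text describes (G: already encrypted, H: to be formatted and encrypted), a full *driveset.csv* might look roughly like this sketch, where the existing BitLocker recovery key is a made-up placeholder:

```
DriveLetter,FormatOption,SilentOrPromptOnFormat,Encryption,ExistingBitLockerKey
G,AlreadyFormatted,SilentMode,AlreadyEncrypted,060456-014509-132033-080300-252615-584177-672089-411631
H,Format,SilentMode,Encrypt,
```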

includes/storage-import-export-update-job-tracking.md

Lines changed: 3 additions & 19 deletions
@@ -6,7 +6,7 @@ services: storage

ms.service: storage
ms.topic: include
-ms.date: 11/18/2021
+ms.date: 02/13/2023
ms.author: alkohli
ms.custom: include file
---
@@ -18,9 +18,9 @@ After you provide tracking details, the job status changes to Shipping, and the
> [!IMPORTANT]
> If the tracking number is not updated within 2 weeks of creating the job, the job expires.

-### [Portal (Preview)](#tab/azure-portal-preview)
+### [Portal](#tab/azure-portal-preview)

-To complete the tracking information for a job that you created in the Preview portal, do these steps:
+To complete the tracking information for a job that you created in the portal, do these steps:

1. Open the job in the [Azure portal/](https://portal.azure.com/).
1. On the **Overview** pane, scroll down to **Tracking information** and complete the entries:
@@ -36,22 +36,6 @@ You can track the job progress on the **Overview** pane. For a description of ea
![Screenshot showing status tracking on the Overview pane for an Azure Import Export job in the Preview portal.](./media/storage-import-export-update-job-tracking/import-export-order-tracking-info-02.png)


-### [Portal (Classic)](#tab/azure-portal-classic)
-
-To complete the tracking information for a job that you created in the Classic portal, do these steps.
-
-1. Open the job in the [Azure portal/](https://portal.azure.com/).
-1. At the top of the ****For job to progress, provide the tracking information** to open the **Update status** pane. Then complete the entries:
-
-1. Select the checkbox by **Mark as shipped**.
-1. Provide the **Carrier** and **Tracking number**.
-1. When you finish, select **Save**.
-
-![!Screenshot of tracking information on the Overview pane for an Azure Import Export job as it appears in the Classic portal. Current job status, Tracking Information area, and Update button are highlighted.](./media/storage-import-export-update-job-tracking/import-export-order-tracking-info-classic-01.png)
-
-You can track the job progress on the **Overview** pane. For a description of each job state, go to [View your job status](../articles/import-export/storage-import-export-view-drive-status.md).
-
-
### [Azure CLI](#tab/azure-cli)

If you created your Azure Import/Export job using Azure CLI, open the job in the Azure portal to update tracking information. Azure CLI and Azure PowerShell create jobs in the classic Azure Import/Export service and hence create an Azure resource of the type "Import/Export job."
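
Because the CLI path sends you to the portal for the tracking update itself, a quick way to locate the job before switching over might be the `import-export` CLI extension; this sketch assumes the extension is available and uses placeholder resource names:

```
az extension add --name import-export
az import-export show --resource-group myResourceGroup --name myImportJob
```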
