Commit 4ea8f33
Minor textual changes
1 parent 8287284
2 files changed: +7 -7 lines

articles/databox/data-box-heavy-migrate-spo.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -76,7 +76,7 @@ After you receive confirmation from the Azure data team that your data copy has
 For best performance and connectivity, we recommend that you create an Azure Virtual Machine (VM).
 
 1. Sign into the Azure portal, and then [Create a virtual machine](../virtual-machines/windows/quick-create-portal.md).
-2. [Mount the Azure file share onto the VM](../storage/files/storage-how-to-use-files-windows.md).
+2. [Mount the Azure file share onto the VM](../storage/files/storage-how-to-use-files-windows.md#mount-the-azure-file-share-with-file-explorer).
 3. [Download the SharePoint Migration tool](http://spmtreleasescus.blob.core.windows.net/install/default.htm) and install it on your Azure VM.
 4. Start the SharePoint Migration Tool. Click **Sign in** and enter your Office 365 username and password.
 5. When prompted **Where is your data?**, select **File share**. Enter the path to your Azure file share where your data is located.
````
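Step 2 in the hunk above mounts the share with File Explorer; the same mount can also be scripted on the VM. A minimal sketch, assuming a hypothetical storage account `mystorageacct` and share `myshare` (replace the account name, share name, and storage account key with your own):

```shell
# Persist the storage account credentials, then map the Azure file
# share to drive Z: (account, share, and key below are placeholders).
cmdkey /add:mystorageacct.file.core.windows.net /user:AZURE\mystorageacct /pass:<storage-account-key>
net use Z: \\mystorageacct.file.core.windows.net\myshare /persistent:yes
```

In the SharePoint Migration Tool, the file share path would then be `Z:\` (or the UNC path directly).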

articles/storage/blobs/data-lake-storage-migrate-on-premises-HDFS-cluster.md

Lines changed: 6 additions & 6 deletions
````diff
@@ -13,7 +13,7 @@ ms.component: data-lake-storage-gen2
 
 # Use Azure Data Box to migrate data from an on-premises HDFS store to Azure Storage
 
-You can migrate data from an on-premises HDFS store of your Hadoop cluster into Azure Storage (blob storage or Data Lake Storage Gen2) by using a Data Box device. You can choose from a 80-TB Data Box or a 770-TB Data Box Heavy.
+You can migrate data from an on-premises HDFS store of your Hadoop cluster into Azure Storage (blob storage or Data Lake Storage Gen2) by using a Data Box device. You can choose from an 80-TB Data Box or a 770-TB Data Box Heavy.
 
 This article helps you complete these tasks:
 
@@ -46,10 +46,10 @@ To copy the data from your on-premises HDFS store to a Data Box device, you'll s
 
 If the amount of data that you are copying is more than the capacity of a single Data Box or that of single node on Data Box Heavy, break up your data set into sizes that do fit into your devices.
 
-Follow these steps to copy data via the REST APIs of Blob/Object storage to your Data Box device. The REST API interface will make the device appear as a HDFS store to your cluster.
+Follow these steps to copy data via the REST APIs of Blob/Object storage to your Data Box device. The REST API interface will make the device appear as an HDFS store to your cluster.
 
 
-1. Before you copy the data via REST, identify the security and connection primitives to connect to the REST interface on the Data Box or Data Box Heavy. Sign in to the local web UI of Data Box and go to **Connect and copy** page. Against the Azure storage account for your device, under **Access settings**, locate and select **REST**.
+1. Before you copy the data via REST, identify the security and connection primitives to connect to the REST interface on the Data Box or Data Box Heavy. Sign in to the local web UI of Data Box and go to **Connect and copy** page. Against the Azure storage account for your device, under **Access settings**, locate, and select **REST**.
 
 !["Connect and copy" page](media/data-lake-storage-migrate-on-premises-HDFS-cluster/data-box-connect-rest.png)
 
@@ -66,7 +66,7 @@ Follow these steps to copy data via the REST APIs of Blob/Object storage to your
 ```
 If you are using some other mechanism for DNS, you should ensure that the Data Box endpoint can be resolved.
 
-4. Set a shell variable `azjars` to point to the `hadoop-azure` and the `microsoft-windowsazure-storage-sdk` jar files. These files are under the Hadoop installation directory (You can check if these files exist by using this command `ls -l $<hadoop_install_dir>/share/hadoop/tools/lib/ | grep azure` where `<hadoop_install_dir>` is the directory where you have installed Hadoop ) Use the full paths.
+4. Set a shell variable `azjars` to point to the `hadoop-azure` and the `microsoft-windowsazure-storage-sdk` jar files. These files are under the Hadoop installation directory (You can check if these files exist by using this command `ls -l $<hadoop_install_dir>/share/hadoop/tools/lib/ | grep azure` where `<hadoop_install_dir>` is the directory where you have installed Hadoop) Use the full paths.
 
 ```
 # azjars=$hadoop_install_dir/share/hadoop/tools/lib/hadoop-azure-2.6.0-cdh5.14.0.jar
````
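Step 4 in the hunk above can be sketched roughly as follows. This assumes a hypothetical install location and illustrative jar versions; the actual directory and file names vary by Hadoop distribution, so check with the `ls | grep azure` command first:

```shell
# Hypothetical Hadoop install location; adjust for your cluster.
hadoop_install_dir=/usr/lib/hadoop

# Confirm the Azure jars are present (file names vary by version):
ls -l $hadoop_install_dir/share/hadoop/tools/lib/ | grep azure

# Point azjars at the full paths of both jars, comma-separated
# (versions below are illustrative placeholders):
azjars=$hadoop_install_dir/share/hadoop/tools/lib/hadoop-azure-2.6.0-cdh5.14.0.jar,$hadoop_install_dir/share/hadoop/tools/lib/microsoft-windowsazure-storage-sdk-0.6.0.jar
```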
````diff
@@ -118,7 +118,7 @@ Follow these steps to copy data via the REST APIs of Blob/Object storage to your
 
 To improve the copy speed:
 - Try changing the number of mappers. (The above example uses `m` = 4 mappers.)
-- Try running mutliple `distcp` in parallel.
+- Try running multiple `distcp` in parallel.
 - Remember that large files perform better than small files.
 
 ## Ship the Data Box to Microsoft
````
````diff
@@ -141,7 +141,7 @@ Follow these steps to prepare and ship the Data Box device to Microsoft.
 
 This step is needed if you are using Azure Data Lake Storage Gen2 as your data store. If you are using just a blob storage account without hierarchical namespace as your data store, you do not need to do this step.
 
-You can do this in 2 ways.
+You can do this in two ways.
 
 
 - Use [Azure Data Factory to move data to ADLS Gen2](https://docs.microsoft.com/azure/data-factory/load-azure-data-lake-storage-gen2). You will have to specify **Azure Blob Storage** as the source.
````
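The copy-speed tips changed above refer to the article's `distcp` job. A hedged sketch of one such run, assuming the `$azjars` variable from step 4; the source path, container name, storage account, and Data Box endpoint hostname are placeholders, not values from this commit:

```shell
# Copy one HDFS source directory to a container on the Data Box REST
# endpoint with 4 mappers (-m 4). To parallelize, launch several of
# these jobs over disjoint source directories.
# (container, account, endpoint, and paths are placeholders)
hadoop distcp \
  -libjars $azjars \
  -m 4 \
  /source/directory/ \
  wasb://hdfscontainer@mystorageaccount.blob.mydataboxno.microsoftdatabox.com/destination/
```

Raising `-m` adds parallel map tasks per job, which helps until the device's network interface saturates; beyond that point, fewer larger files matter more than more mappers.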
