File: articles/databox/data-box-deploy-copy-data-via-nfs.md
author: stevenmatthew
ms.service: databox
ms.subservice: pod
ms.topic: tutorial
ms.date: 03/25/2024
ms.author: shaas
#Customer intent: As an IT admin, I need to be able to copy data to Data Box to upload on-premises data from my server onto Azure.
---
# Tutorial: Copy data to Azure Data Box via NFS
> [!IMPORTANT]
> Azure Data Box now supports access tier assignment at the blob level. The steps contained within this tutorial reflect the updated data copy process and are specific to block blobs.
>
> For help with determining the appropriate access tier for your block blob data, refer to the [Determine appropriate access tiers for block blobs](#determine-appropriate-access-tiers-for-block-blobs) section. Follow the steps contained within the [Copy data to Azure Data Box](#copy-data-to-azure-data-box) section to copy your data to the appropriate access tier.
>
> The information contained within this section applies to orders placed after April 1, 2024.
This tutorial describes how to connect to and copy data from your host computer using the local web UI.
In this tutorial, you learn how to:
Before you begin, make sure that:
1. You complete the [Tutorial: Set up Azure Data Box](data-box-deploy-set-up.md).
2. You receive your Data Box and the order status in the portal is **Delivered**.
3. You have a host computer that has the data that you want to copy over to Data Box. Your host computer must:
- Run a [Supported operating system](data-box-system-requirements.md).
- Be connected to a high-speed network. We strongly recommend that you have at least one 10-GbE connection. If a 10-GbE connection isn't available, a 1-GbE data link can be used but the copy speeds are impacted.
## Connect to Data Box
Based on the storage account selected, Data Box creates up to:
* Three shares for each associated storage account for GPv1 and GPv2.
* One share for premium storage.
* One share for a blob storage account, containing one folder for each of the four access tiers.
The following table identifies the names of the Data Box shares to which you can connect, and the type of data uploaded to your target storage account. It also identifies the hierarchy of shares and directories into which you copy your source data.
| Storage type | Share name | First-level entity | Second-level entity | Third-level entity |
|--------------|------------|--------------------|---------------------|--------------------|

You can't copy files directly to the *root* folder of any Data Box share. Instead, create folders within the Data Box share corresponding to your use case.
Block blobs support the assignment of access tiers at the file level. Before you copy files to the block blob share, the recommended best practice is to create new subfolders within the appropriate access tier. After creating new subfolders, continue adding files to each subfolder as appropriate.
A new container is created for any folder residing at the root of the block blob share. Any file within the folder is copied to the storage account's default access tier as a block blob.
For more information about blob access tiers, see [Access tiers for blob data](../storage/blobs/access-tiers-overview.md). For more detailed information about access tier best practices, see [Best practices for using blob access tiers](../storage/blobs/access-tiers-best-practices.md).
The following table shows the UNC path to the shares on your Data Box and the corresponding Azure Storage path URL to which data is uploaded. The final Azure Storage path URL can be derived from the UNC share path.
If you're using a Linux host computer, perform the following steps to configure Data Box to allow access to NFS clients.
1. Supply the IP addresses of the allowed clients that can access the share. In the local web UI, go to the **Connect and copy** page. Under **NFS settings**, select **NFS client access**.
2. Supply the IP address of the NFS client and select **Add**. You can configure access for multiple NFS clients by repeating this step. Select **OK**.
`sudo mount <Data Box device IP>:/<NFS share on Data Box device> <Path to the folder on local Linux computer>`
Use the following example to connect to a Data Box share using NFS. In the example, the Data Box device IP is `10.161.23.130`. The share `Mystoracct_Blob` is mounted on the ubuntuVM, and the mount point is `/home/databoxubuntuhost/databox`.
`sudo mount -t nfs 10.161.23.130:/Mystoracct_Blob /home/databoxubuntuhost/databox`
For Mac clients, you need to add an extra option as follows:
`sudo mount -t nfs -o sec=sys,resvport 10.161.23.130:/Mystoracct_Blob /home/databoxubuntuhost/databox`
> [!IMPORTANT]
> You can't copy files directly to the storage account's *root* folder. Within a block blob storage account's root folder, you'll find a folder corresponding to each of the available access tiers.
>
> To copy your data to Azure Data Box, you must first select the folder corresponding to one of the access tiers. Next, create a sub-folder within that tier's folder to store your data. Finally, copy your data to the newly created sub-folder. Your new sub-folder represents the container created within the storage account during ingestion. Your data is uploaded to this container as blobs.
<!--**Always create a folder for the files that you intend to copy under the share and then copy the files to that folder**. The folder created under block blob and page blob shares represents a container to which data is uploaded as blobs. You cannot copy files directly to *root* folder in the storage account.-->
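The tier-folder workflow described above can be sketched as follows. This is an illustrative sketch, not part of the official procedure: a temporary directory stands in for the real NFS mount, and the folder and file names (`mycontainer`, `file1.txt`) are hypothetical.

```shell
# Sketch of the access-tier layout described above. A temp directory stands in
# for the mounted block blob share; replace MOUNT with your real mount point,
# for example /home/databoxubuntuhost/databox.
MOUNT=$(mktemp -d)

# 1. Pick an access tier folder (Hot, Cool, Cold, or Archive) and create a
#    subfolder inside it. The subfolder becomes a container in Azure.
mkdir -p "$MOUNT/Hot/mycontainer"

# 2. Copy data into the subfolder. Each file is uploaded as a block blob
#    with the tier implied by the parent tier folder.
echo "sample data" > "$MOUNT/Hot/mycontainer/file1.txt"

# Show the resulting hierarchy: <tier>/<container>/<blob>
find "$MOUNT" -mindepth 1
```

The same pattern applies to the other tiers: only the top-level tier folder changes.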
85
106
107
+
## Determine appropriate access tiers for block blobs
> [!IMPORTANT]
> The information contained within this section applies to orders placed after April 1, 2024.
Azure Storage allows you to store block blob data in several access tiers within the same storage account. This ability allows data to be organized and stored more efficiently based on how often it's accessed. The following table contains information and recommendations about Azure Storage access tiers.
| Tier | Recommendation | Best practice |
|---------|----------------|---------------|
| Hot | Useful for online data accessed or modified frequently. This tier has the highest storage costs, but the lowest access costs. | Data in this tier should be in regular and active use. |
| Cool | Useful for online data accessed or modified infrequently. This tier has lower storage costs and higher access costs than the hot tier. | Data in this tier should be stored for at least 30 days. |
| Cold | Useful for online data accessed or modified rarely but still requiring fast retrieval. This tier has lower storage costs and higher access costs than the cool tier.| Data in this tier should be stored for a minimum of 90 days. |
| Archive | Useful for offline data rarely accessed and having lower latency requirements. | Data in this tier should be stored for a minimum of 180 days. Data removed from the archive tier within 180 days is subject to an early deletion charge. |
For more information about blob access tiers, see [Access tiers for blob data](../storage/blobs/access-tiers-overview.md). For more detailed best practices, see [Best practices for using blob access tiers](../storage/blobs/access-tiers-best-practices.md).
You can transfer your block blob data to the appropriate access tier by copying it to the corresponding folder within Data Box. This process is discussed in greater detail within the [Copy data to Azure Data Box](#copy-data-to-azure-data-box) section.
## Copy data to Data Box
After you connect to one or more Data Box shares, the next step is to copy data. Before you begin the data copy, consider the following limitations:
* Make sure that you copy your data to the share that corresponds to the required data format. For instance, copy block blob data to the share for block blobs. Copy VHDs to the page blob share. If the data format doesn't match the appropriate share type, the data upload to Azure fails during a later step.
* When copying data to the *AzFile* or *PageBlob* shares, first create a folder at the share's root, then copy files to that folder.
* When copying data to the *BlockBlob* share, create a subfolder within the desired access tier, then copy data to the newly created subfolder. The subfolder represents a container into which data is uploaded as blobs. You can't copy files directly to a share's *root* folder.
* While copying data, ensure that the data size conforms to the size limits described in the [Azure storage account size limits](data-box-limits.md#azure-storage-account-size-limits).
* Simultaneous uploads by Data Box and another non-Data Box application can result in upload job failures and data corruption.
* If you use both the SMB and NFS protocols for data copies, we recommend that you:
* Use different storage accounts for SMB and NFS.
* Don't copy the same data to the same end destination in Azure using both SMB and NFS. In these cases, the final outcome can't be determined.
* Although copying via both SMB and NFS in parallel can work, we don't recommend doing that as it's prone to human error. Wait until your SMB data copy is complete before you start an NFS data copy.
* When copying data to the block blob share, create a subfolder within the desired access tier, then copy data to the newly created subfolder. The subfolder represents a container to which your data is uploaded as blobs. You can't copy files directly to the *root* folder in the storage account.
* If ingesting case-sensitive directory and file names from an NFS share to NFS on Data Box:
* The case is preserved in the name.
* The files are case-insensitive.
For example, if copying `SampleFile.txt` and `Samplefile.Txt`, the case is preserved in the name when copied to Data Box. However, because they're considered the same file, the last file uploaded overwrites the first file.
> [!IMPORTANT]
> Make sure that you maintain a copy of the source data until you can confirm that your data has been copied into Azure Storage.
If you're using a Linux host computer, use a copy utility similar to Robocopy. Some of the alternatives available in Linux are [`rsync`](https://rsync.samba.org/), [FreeFileSync](https://www.freefilesync.org/), [Unison](https://www.cis.upenn.edu/~bcpierce/unison/), or [Ultracopier](https://ultracopier.first-world.info/).
`cd /local_path/; find -L . -type f | parallel -j X rsync -za {} /mnt/databox/{}`
where the *-j* option controls parallelization and *X* is the number of parallel copies to run.
We recommend that you start with 16 parallel copies and increase the number of threads depending on the resources available.
> [!IMPORTANT]
> The following Linux file types are not supported: symbolic links, character files, block files, sockets, and pipes. These file types will result in failures during the **Prepare to ship** step.
Notifications are displayed during the copy process to identify errors.
> [!div class="checklist"]
>
> * Data Box data copy prerequisites
> * Connecting to Data Box
> * Determining appropriate access tiers for block blobs
> * Copying data to Data Box
Advance to the next tutorial to learn how to ship your Data Box back to Microsoft.