
Commit 8eb91a6

Merge pull request #215742 from kavarral/patch-11
Reviewed for freshness and fixed grammar
2 parents: ed7655a + 5b38e63

File tree: 1 file changed (+4, −4 lines)

articles/data-factory/load-azure-data-lake-storage-gen2-from-gen1.md

Lines changed: 4 additions & 4 deletions
@@ -7,7 +7,7 @@ ms.service: data-factory
 ms.subservice: data-movement
 ms.topic: conceptual
 ms.custom: seo-lt-2019
-ms.date: 08/06/2021
+ms.date: 10/25/2022
 ---

 # Copy data from Azure Data Lake Storage Gen1 to Gen2 with Azure Data Factory
@@ -121,7 +121,7 @@ ADF offers a serverless architecture that allows parallelism at different levels

 Customers have successfully migrated petabytes of data consisting of hundreds of millions of files from Data Lake Storage Gen1 to Gen2, with a sustained throughput of 2 GBps and higher.

-you can achieve great data movement speeds through different levels of parallelism:
+You can achieve greater data movement speeds by applying different levels of parallelism:

 - A single copy activity can take advantage of scalable compute resources: when using Azure Integration Runtime, you can specify up to 256 [data integration units (DIUs)](copy-activity-performance-features.md#data-integration-units) for each copy activity in a serverless manner; when using self-hosted Integration Runtime, you can manually scale up the machine or scale out to multiple machines (up to 4 nodes), and a single copy activity will partition its file set across all nodes.
 - A single copy activity reads from and writes to the data store using multiple threads.
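
As context for the parallelism bullets in the hunk above, here is a minimal sketch of where those settings sit in a copy activity definition, written as a Python dict that mirrors the ADF JSON. The activity name, dataset references, and the specific DIU/parallel-copy values are illustrative assumptions, not part of the article; `dataIntegrationUnits` and `parallelCopies` are the copy activity properties as I understand the schema.

```python
# Sketch only: a copy activity payload (Python dict mirroring ADF JSON) that pins
# the parallelism settings discussed above. Names are hypothetical placeholders.
copy_activity = {
    "name": "CopyGen1ToGen2",  # hypothetical activity name
    "type": "Copy",
    "inputs": [{"referenceName": "Gen1BinaryDataset", "type": "DatasetReference"}],
    "outputs": [{"referenceName": "Gen2BinaryDataset", "type": "DatasetReference"}],
    "typeProperties": {
        "source": {"type": "AzureDataLakeStoreSource", "recursive": True},
        "sink": {"type": "AzureBlobFSSink"},
        "dataIntegrationUnits": 256,  # up to 256 DIUs per copy activity on Azure IR
        "parallelCopies": 32,         # multiple concurrent read/write threads
    },
}
```

Deploying this is out of scope here; the point is simply where `dataIntegrationUnits` and `parallelCopies` sit relative to the copy activity's `typeProperties`.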
@@ -176,12 +176,12 @@ You can also enable [fault tolerance](copy-activity-fault-tolerance.md) in copy

 ### Permissions

-In Data Factory, the [Data Lake Storage Gen1 connector](connector-azure-data-lake-store.md) supports service principal and managed identity for Azure resource authentications. The [Data Lake Storage Gen2 connector](connector-azure-data-lake-storage.md) supports account key, service principal, and managed identity for Azure resource authentications. To make Data Factory able to navigate and copy all the files or access control lists (ACLs) you need, grant high enough permissions for the account you provide to access, read, or write all files and set ACLs if you choose to. Grant it a super-user or owner role during the migration period.
+In Data Factory, the [Data Lake Storage Gen1 connector](connector-azure-data-lake-store.md) supports service principal and managed identity for Azure resource authentications. The [Data Lake Storage Gen2 connector](connector-azure-data-lake-storage.md) supports account key, service principal, and managed identity for Azure resource authentications. To make Data Factory able to navigate and copy all the files or access control lists (ACLs) you will need to grant high enough permissions to the account to access, read, or write all files and set ACLs if you choose to. You should grant the account a super-user or owner role during the migration period and remove the elevated permissions once the migration is completed.


 ## Next steps

 > [!div class="nextstepaction"]
 > [Copy activity overview](copy-activity-overview.md)
 > [Azure Data Lake Storage Gen1 connector](connector-azure-data-lake-store.md)
-> [Azure Data Lake Storage Gen2 connector](connector-azure-data-lake-storage.md)
+> [Azure Data Lake Storage Gen2 connector](connector-azure-data-lake-storage.md)
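
As context for the permissions note in the hunk above, here is a minimal sketch of linked service payloads using service principal authentication, which both connectors support, again written as Python dicts mirroring the ADF JSON. Every account name, ID, and secret below is a placeholder assumption, and the linked service names are hypothetical.

```python
# Sketch only: Gen1 and Gen2 linked service payloads (Python dicts mirroring ADF
# JSON) with service principal auth. All URIs, IDs, and secrets are placeholders.
gen1_linked_service = {
    "name": "AdlsGen1Source",  # hypothetical name
    "properties": {
        "type": "AzureDataLakeStore",
        "typeProperties": {
            "dataLakeStoreUri": "https://<gen1-account>.azuredatalakestore.net/webhdfs/v1",
            "servicePrincipalId": "<app-id>",
            "servicePrincipalKey": {"type": "SecureString", "value": "<app-secret>"},
            "tenant": "<tenant-id>",
        },
    },
}

gen2_linked_service = {
    "name": "AdlsGen2Sink",  # hypothetical name
    "properties": {
        "type": "AzureBlobFS",
        "typeProperties": {
            "url": "https://<gen2-account>.dfs.core.windows.net",
            "servicePrincipalId": "<app-id>",
            "servicePrincipalKey": {"type": "SecureString", "value": "<app-secret>"},
            "tenant": "<tenant-id>",
        },
    },
}
```

Per the permissions paragraph in the diff, the identity used here would need elevated access on both accounts for the migration window (for example, super-user on Gen1 and an owner-level data role on Gen2), and that access should be removed once the migration is complete.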
