Commit 67012b7

Some checkins
1 parent f7ef9f2 commit 67012b7

1 file changed (+16 −16 lines)

articles/storage/blobs/data-lake-storage-migrate-gen1-to-gen2.md

Lines changed: 16 additions & 16 deletions
@@ -122,16 +122,16 @@ This is the simplest pattern.
 
 2. Move data from Gen1 to Gen2.
 
+> [!TIP]
+> For data transfer, we recommend [Azure Data Factory](https://docs.microsoft.com/azure/data-factory/connector-azure-data-lake-storage). ACLs copy with the data.
+
 3. Point ingest operations and workloads to Gen2.
 
 4. Decommission Gen1.
 
 ![lift and shift pattern](./media/data-lake-storage-migrate-gen1-to-gen2/lift-and-shift.png)
 
-> [!TIP]
-> For data transfer, we recommend [Azure Data Factory](https://docs.microsoft.com/azure/data-factory/connector-azure-data-lake-storage). ACLs copy with the data.
-
-#### Considerations for using this pattern:
+#### Considerations for using the lift and shift pattern
 
 :heavy_check_mark: Cutover from Gen1 to Gen2 for all workloads at the same time.
 
@@ -143,6 +143,9 @@ This is the simplest pattern.
 
 1. Start moving data from Gen1 to Gen2.
 
+> [!TIP]
+> For data transfer, we recommend [Azure Data Factory](https://docs.microsoft.com/azure/data-factory/connector-azure-data-lake-storage). ACLs copy with the data.
+
 2. Incrementally copy new data from Gen1.
 
 3. After all data is copied, stop all writes to Gen1, and point workloads to Gen2.
@@ -151,10 +154,7 @@ This is the simplest pattern.
 
 ![Incremental copy pattern](./media/data-lake-storage-migrate-gen1-to-gen2/incremental-copy.png)
 
-> [!TIP]
-> For data transfer, we recommend [Azure Data Factory](https://docs.microsoft.com/azure/data-factory/connector-azure-data-lake-storage). ACLs copy with the data.
-
-#### Considerations for using this pattern:
+#### Considerations for using the incremental copy pattern:
 
 :heavy_check_mark: Cutover from Gen1 to Gen2 for all workloads at the same time.
 
@@ -166,6 +166,9 @@ This is the simplest pattern.
 
 1. Move data from Gen1 to Gen2.
 
+> [!TIP]
+> For data transfer, we recommend [Azure Data Factory](https://docs.microsoft.com/azure/data-factory/connector-azure-data-lake-storage). ACLs copy with the data.
+
 2. Ingest new data to both Gen1 and Gen2.
 
 3. Point workloads to Gen2.
@@ -174,10 +177,7 @@ This is the simplest pattern.
 
 ![Dual pipeline pattern](./media/data-lake-storage-migrate-gen1-to-gen2/dual-pipeline.png)
 
-> [!TIP]
-> For data transfer, we recommend [Azure Data Factory](https://docs.microsoft.com/azure/data-factory/connector-azure-data-lake-storage). ACLs copy with the data.
-
-#### Considerations for using this pattern:
+#### Considerations for using the dual pipeline pattern:
 
 :heavy_check_mark: Gen1 and Gen2 pipelines run side-by-side.
 
@@ -191,6 +191,9 @@ This pattern is similar to the *dual pipeline* pattern, but it's more ideally su
 
 1. Set up bidirectional replication between Gen1 and Gen2.
 
+> [!TIP]
+> For bidirectional data transfer, we recommend [WanDisco](https://docs.wandisco.com/bigdata/wdfusion/adls/). It offers a repair feature for existing data.
+
 2. Incrementally move ingest and compute workloads to Gen2.
 
 3. When all moves are complete, stop all writes to Gen1 and turn off bidirectional replication.
@@ -199,10 +202,7 @@ This pattern is similar to the *dual pipeline* pattern, but it's more ideally su
 
 ![Bidirectional pattern](./media/data-lake-storage-migrate-gen1-to-gen2/bidirectional-sync.png)
 
-> [!TIP]
-> For bidirectional data transfer, we recommend [WanDisco](https://docs.wandisco.com/bigdata/wdfusion/adls/). It offers a repair feature for existing data.
-
-#### Considerations for using this pattern:
+#### Considerations for using the bi-directional sync pattern:
 
 :heavy_check_mark: Ideal for complex scenarios that involve a large number of pipelines and dependencies where a phased approach might make more sense.
 
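The tips added in this change recommend Azure Data Factory for the Gen1-to-Gen2 transfer because its copy activity can carry the ACLs across with the data. As a minimal, hypothetical sketch of that copy step (the dataset names `Gen1SourceDataset` and `Gen2SinkDataset` are placeholders you would define against your own Gen1 and Gen2 linked services; verify the exact property names against the connector article linked in the tip):

```json
{
  "name": "CopyGen1ToGen2",
  "type": "Copy",
  "inputs": [ { "referenceName": "Gen1SourceDataset", "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "Gen2SinkDataset", "type": "DatasetReference" } ],
  "typeProperties": {
    "source": { "type": "AzureDataLakeStoreSource", "recursive": true },
    "sink": { "type": "AzureBlobFSSink", "copyBehavior": "PreserveHierarchy" },
    "preserve": [ "ACLs" ]
  }
}
```

Here `"preserve": ["ACLs"]` is the setting that asks the copy activity to replicate Gen1 ACLs onto the Gen2 side, and `PreserveHierarchy` keeps the folder structure intact so workload paths don't change after cutover.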

0 commit comments
