
Commit c01e56b

status
1 parent 4769f05 commit c01e56b

10 files changed (+10, -10 lines)

articles/data-factory/concepts-data-flow-performance.md

Lines changed: 1 addition & 1 deletion
@@ -18,7 +18,7 @@ Mapping data flows in Azure Data Factory and Synapse pipelines provide a code-fr

Watch the following video to see some sample timings for transforming data with data flows.

-> [!VIDEO https://www.microsoft.com/en-us/videoplayer/embed/RE4rNxM]
+> [!VIDEO https://learn-video.azurefd.net/vod/player?id=0c322fbc-bcd2-4698-b031-4a51b1d9d129]

## Monitoring data flow performance

articles/data-factory/concepts-data-flow-schema-drift.md

Lines changed: 1 addition & 1 deletion
@@ -29,7 +29,7 @@ You need to make an architectural decision in your data flow to accept schema dr

This video provides an introduction to some of the complex solutions that you can build easily in Azure Data Factory or Synapse Analytics pipelines with data flow's **schema drift** feature. In this example, we build reusable patterns based on flexible database schemas:

-> [!VIDEO https://www.microsoft.com/en-us/videoplayer/embed/RE4tyx7]
+> [!VIDEO https://learn-video.azurefd.net/vod/player?id=941aff82-3f60-45be-853c-088bff9d703e]

## Schema drift in source

articles/data-factory/concepts-data-flow-udf.md

Lines changed: 1 addition & 1 deletion
@@ -20,7 +20,7 @@ A user defined function is a customized expression you can define to be able to

Whenever you find yourself building the same logic in an expression across multiple mapping data flows, that's a good opportunity to turn it into a user defined function.

-> [!VIDEO https://www.microsoft.com/en-us/videoplayer/embed/RE4Zkek]
+> [!VIDEO https://learn-video.azurefd.net/vod/player?id=6ee2ba96-a6ca-4a57-8545-d03032aa68a2]
>

## Getting started
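For context on the reuse pattern this article describes, here is a minimal data flow script sketch of calling a shared function from a derived column; the function name `cleanPhone` and the stream and column names are hypothetical, not part of this commit:

```
source1 derive(standardPhone = cleanPhone(phone, 'US')) ~> NormalizePhone
```

Any data flow that needs the same normalization can call the same user defined function instead of repeating the expression logic inline.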

articles/data-factory/continuous-integration-delivery-hotfix-environment.md

Lines changed: 1 addition & 1 deletion
@@ -43,7 +43,7 @@ Use the following steps to deploy a hotfix in your production and test environme

## Video tutorial
See the video below for an in-depth tutorial on how to hot-fix your environments.

-> [!VIDEO https://www.microsoft.com/videoplayer/embed/RE4I7fi]
+> [!VIDEO https://learn-video.azurefd.net/vod/player?id=b0bab151-85b7-4684-a184-ec5e6b972415]

## Related content

articles/data-factory/continuous-integration-delivery.md

Lines changed: 1 addition & 1 deletion
@@ -83,7 +83,7 @@ If you're using Git integration with your data factory and have a CI/CD pipeline

To learn how to set up a feature flag, see the video tutorial below:

->[!VIDEO https://www.microsoft.com/videoplayer/embed/RE4IxdW]
+>[!VIDEO https://learn-video.azurefd.net/vod/player?id=753e946c-f8e0-4a70-b352-2ed1aa296466]

## Unsupported features

articles/data-factory/control-flow-power-query-activity.md

Lines changed: 1 addition & 1 deletion
@@ -16,7 +16,7 @@ The Power Query activity allows you to build and execute Power Query mash-ups to

You can work directly inside of the Power Query mash-up editor to perform interactive data exploration and then save your work. Once complete, you can take your Power Query activity and add it to a pipeline. Azure Data Factory will automatically scale it out and operationalize your data wrangling using Azure Data Factory's data flow Spark environment.

-> [!VIDEO https://www.microsoft.com/en-us/videoplayer/embed/RE4MFYn]
+> [!VIDEO https://learn-video.azurefd.net/vod/player?id=a7e9315d-4903-40c1-a759-d6fbd40813de]

## Create a Power Query activity with UI

articles/data-factory/data-flow-alter-row.md

Lines changed: 1 addition & 1 deletion
@@ -23,7 +23,7 @@ Use the Alter Row transformation to set insert, delete, update, and upsert polic

Alter Row transformations only operate on database, REST, or Azure Cosmos DB sinks in your data flow. The actions that you assign to rows (insert, update, delete, upsert) don't occur during debug sessions. To enact the alter row policies on your database tables, run an Execute Data Flow activity in a pipeline.

-> [!VIDEO https://www.microsoft.com/en-us/videoplayer/embed/RE4vJYc]
+> [!VIDEO https://learn-video.azurefd.net/vod/player?id=6e4cb953-1ae0-4a20-9d7f-42bb85c4245b]

> [!NOTE]
> An Alter Row transformation is not needed for Change Data Capture data flows that use native CDC sources like SQL Server or SAP. In those instances, ADF will automatically detect the row marker so Alter Row policies are unnecessary.
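For context on the article being updated, a rough data flow script sketch of alter-row policies; the upstream stream name `SpecifyConditions` and the `rowAction` marker column are hypothetical, not part of this commit:

```
SpecifyConditions alterRow(insertIf(rowAction == 'insert'),
    updateIf(rowAction == 'update'),
    deleteIf(rowAction == 'delete'),
    upsertIf(rowAction == 'upsert')) ~> SetRowPolicies
```

A database, REST, or Azure Cosmos DB sink downstream then honors those row markers when the data flow runs from a pipeline.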

articles/data-factory/data-flow-assert.md

Lines changed: 1 addition & 1 deletion
@@ -16,7 +16,7 @@ ms.date: 01/05/2024

The Assert transformation enables you to build custom rules inside your mapping data flows for data quality and data validation. You can build rules that determine whether values meet an expected value domain. Additionally, you can build rules that check for row uniqueness. The Assert transformation helps to determine if each row in your data meets a set of criteria. The Assert transformation also allows you to set custom error messages when data validation rules aren't met.

-> [!VIDEO https://www.microsoft.com/en-us/videoplayer/embed/RWRdIu]
+> [!VIDEO https://learn-video.azurefd.net/vod/player?id=d7dab33e-7008-45f8-b982-1451b157900a]

:::image type="content" source="media/data-flow/data-flow-assert-001.png" alt-text="Assert type":::

articles/data-factory/data-flow-conditional-split.md

Lines changed: 1 addition & 1 deletion
@@ -19,7 +19,7 @@ ms.date: 01/05/2024

The conditional split transformation routes data rows to different streams based on matching conditions. The conditional split transformation is similar to a CASE decision structure in a programming language. The transformation evaluates expressions, and based on the results, directs the data row to the specified stream.

-> [!VIDEO https://www.microsoft.com/en-us/videoplayer/embed/RE4wKCX]
+> [!VIDEO https://learn-video.azurefd.net/vod/player?id=232f117d-99a3-4742-8c68-b6dc4d7c6172]

## Configuration
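As a rough illustration of the CASE-like routing described above, a data flow script sketch; the stream and column names are hypothetical, not part of this commit:

```
CleanedOrders split(orderTotal > 1000,
    orderTotal > 100,
    disjoint: false) ~> SplitByTotal@(largeOrders, mediumOrders, smallOrders)
```

With `disjoint: false`, a row goes to the first stream whose condition it matches, and anything that matches neither condition falls through to `smallOrders`.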

articles/data-factory/data-flow-exists.md

Lines changed: 1 addition & 1 deletion
@@ -19,7 +19,7 @@ ms.date: 01/05/2024

The exists transformation is a row filtering transformation that checks whether your data exists in another source or stream. The output stream includes all rows in the left stream that either exist or don't exist in the right stream. The exists transformation is similar to ```SQL WHERE EXISTS``` and ```SQL WHERE NOT EXISTS```.

-> [!VIDEO https://www.microsoft.com/en-us/videoplayer/embed/RE4vZKz]
+> [!VIDEO https://learn-video.azurefd.net/vod/player?id=53f2da73-3587-4d51-9f25-9f89e9446a8c]

## Configuration
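To make the WHERE EXISTS analogy concrete, a minimal data flow script sketch; the `Orders` and `Customers` streams and the `customerId` column are hypothetical, not part of this commit:

```
Orders, Customers exists(Orders@customerId == Customers@customerId,
    negate: false,
    broadcast: 'auto') ~> OrdersWithKnownCustomers
```

Setting `negate: true` flips the behavior to the WHERE NOT EXISTS case, keeping only left-stream rows that have no match in the right stream.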
