Commit 7b436e0

Update trigger verbiage
1 parent 961d510 commit 7b436e0

6 files changed: 7 additions, 6 deletions

articles/data-factory/pricing-examples-copy-transform-azure-databricks.md (1 addition, 1 deletion)

@@ -26,7 +26,7 @@ To accomplish the scenario, you need to create a pipeline with the following items:
 
 1. One copy activity with an input dataset for the data to be copied from AWS S3, and an output dataset for the data on Azure storage.
 2. One Azure Databricks activity for the data transformation.
-3. One schedule trigger to execute the pipeline every hour.
+3. One schedule trigger to execute the pipeline every hour. When you want to run a pipeline, you can either [trigger it immediately or schedule it](concepts-pipeline-execution-triggers.md). In addition to the pipeline itself, each trigger instance counts as a single Activity run.
 
 :::image type="content" source="media/pricing-concepts/scenario2.png" alt-text="Diagram shows a pipeline with a schedule trigger. In the pipeline, copy activity flows to an input dataset, an output dataset, and a DataBricks activity, which runs on Azure Databricks. The input dataset flows to an A W S S3 linked service. The output dataset flows to an Azure Storage linked service.":::
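The sentence this commit adds says that every trigger instance is billed as one Activity run, on top of the runs for the activities inside the pipeline. As a back-of-the-envelope sketch of what that means for the hourly-trigger scenarios (the helper name and the 30-day month are illustrative assumptions, not anything stated in the articles):

```python
# Hedged sketch: estimate monthly Activity runs for a pipeline fired by a
# schedule trigger, where each trigger instance counts as one Activity run
# in addition to the pipeline's own activities.

def monthly_activity_runs(fires_per_day: int, activities_per_run: int, days: int = 30) -> int:
    """Trigger instances plus activity executions over an assumed 30-day month."""
    per_fire = 1 + activities_per_run  # 1 trigger instance + its activities
    return fires_per_day * days * per_fire

# Scenario above: hourly trigger, copy activity + Databricks activity.
print(monthly_activity_runs(fires_per_day=24, activities_per_run=2))  # 24 * 30 * 3 = 2160
```

The point of the `1 +` term is exactly the clarification the commit makes: the trigger instance itself is counted, not just the activities it starts.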

articles/data-factory/pricing-examples-copy-transform-dynamic-parameters.md (1 addition, 1 deletion)

@@ -27,7 +27,7 @@ To accomplish the scenario, you need to create a pipeline with the following items:
 1. One copy activity with an input dataset for the data to be copied from AWS S3, an output dataset for the data on Azure storage.
 2. One Lookup activity for passing parameters dynamically to the transformation script.
 3. One Azure Databricks activity for the data transformation.
-4. One schedule trigger to execute the pipeline every hour.
+4. One schedule trigger to execute the pipeline every hour. When you want to run a pipeline, you can either [trigger it immediately or schedule it](concepts-pipeline-execution-triggers.md). In addition to the pipeline itself, each trigger instance counts as a single Activity run.
 
 :::image type="content" source="media/pricing-concepts/scenario3.png" alt-text="Diagram shows a pipeline with a schedule trigger. In the pipeline, copy activity flows to an input dataset, an output dataset, and lookup activity that flows to a DataBricks activity, which runs on Azure Databricks. The input dataset flows to an A W S S3 linked service. The output dataset flows to an Azure Storage linked service.":::

articles/data-factory/pricing-examples-data-integration-managed-vnet.md (1 addition, 1 deletion)

@@ -27,7 +27,7 @@ To accomplish the scenario, you need to create two pipelines with the following items:
 - A pipeline activity – Delete Activity.
 - A copy activity with an input dataset for the data to be copied from Azure Blob storage.
 - An output dataset for the data on Azure SQL Database.
-- A schedule triggers to execute the pipeline.
+- A schedule trigger to execute the pipeline. When you want to run a pipeline, you can either [trigger it immediately or schedule it](concepts-pipeline-execution-triggers.md). In addition to the pipeline itself, each trigger instance counts as a single Activity run.
 
 ## Costs estimation

articles/data-factory/pricing-examples-get-delta-data-from-sap-ecc.md (2 additions, 1 deletion)

@@ -25,7 +25,8 @@ Refer to the [Azure Pricing Calculator](https://azure.microsoft.com/pricing/calc
 To accomplish the scenario, you need to create a pipeline with the following items:
 
 - One Mapping Data Flow activity with an input dataset for the data to be loaded from SAP ECC, the transformation logic, and an output dataset for the data on Azure Data Lake Gen2 storage.
-- A Self-Hosted Integration Runtime referenced to SAP CDC connector.
+- A Self-Hosted Integration Runtime referenced to SAP CDC connector.
+- A schedule trigger to execute the pipeline. When you want to run a pipeline, you can either [trigger it immediately or schedule it](concepts-pipeline-execution-triggers.md). In addition to the pipeline itself, each trigger instance counts as a single Activity run.
 
 ## Costs estimation

articles/data-factory/pricing-examples-s3-to-blob.md (1 addition, 1 deletion)

@@ -26,7 +26,7 @@ To accomplish the scenario, you need to create a pipeline with the following items:
 
 1. I'll copy data from AWS S3 to Azure Blob storage, and this will move 10 GB of data from S3 to blob storage. I estimate it will run for 2-3 hours, and I plan to set DIU as Auto.
 
-3. A schedule trigger to execute the pipeline every hour for 8 hours every day.
+3. A schedule trigger to execute the pipeline every hour for 8 hours every day. When you want to run a pipeline, you can either [trigger it immediately or schedule it](concepts-pipeline-execution-triggers.md). In addition to the pipeline itself, each trigger instance counts as a single Activity run.
 
 :::image type="content" source="media/pricing-concepts/scenario1.png" alt-text="Diagram shows a pipeline with a schedule trigger.":::
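For this scenario the trigger fires every hour for 8 hours a day, so the trigger-instance runs accumulate differently than in the hourly-around-the-clock examples. A minimal sketch of the orchestration-cost arithmetic, assuming orchestration is billed per 1,000 Activity runs; the rate constant below is a placeholder for illustration, not a published Azure price:

```python
# Hedged sketch for the S3-to-Blob scenario: 8 trigger fires per day, and
# each trigger instance counts as one Activity run alongside the copy activity.
# PLACEHOLDER_RATE is a made-up figure; look up the real rate on the Azure
# pricing page for your region.

PLACEHOLDER_RATE_PER_1000_RUNS = 1.00  # hypothetical currency units

def monthly_orchestration_cost(fires_per_day: int, activities_per_fire: int,
                               days: int = 30,
                               rate_per_1000: float = PLACEHOLDER_RATE_PER_1000_RUNS) -> float:
    """Cost of trigger instances + activity executions, billed per 1,000 runs."""
    runs = fires_per_day * days * (1 + activities_per_fire)
    return runs / 1000 * rate_per_1000

# 8 fires/day, one copy activity per fire -> 8 * 30 * 2 = 480 runs/month.
print(monthly_orchestration_cost(fires_per_day=8, activities_per_fire=1))
```

With the placeholder rate this comes to 0.48 units for the month; the structure of the calculation, not the number, is the point.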

articles/data-factory/pricing-examples-transform-mapping-data-flows.md (1 addition, 1 deletion)

@@ -27,7 +27,7 @@ To accomplish the scenario, you need to create a pipeline with the following items:
 1. A Data Flow activity with the transformation logic.
 1. An input dataset for the data on Azure Storage.
 1. An output dataset for the data on Azure Storage.
-1. A schedule trigger to execute the pipeline every hour.
+1. A schedule trigger to execute the pipeline every hour. When you want to run a pipeline, you can either [trigger it immediately or schedule it](concepts-pipeline-execution-triggers.md). In addition to the pipeline itself, each trigger instance counts as a single Activity run.
 
 ## Costs estimation
