Commit e27756f

Merge pull request #230476 from SunilVeldurthi/patch-1
Update tutorial-run-existing-pipeline-with-airflow.md
2 parents 09f3438 + 33cf8dc commit e27756f

1 file changed: +2 -2 lines changed

articles/data-factory/tutorial-run-existing-pipeline-with-airflow.md

Lines changed: 2 additions & 2 deletions
@@ -89,8 +89,8 @@ Data Factory pipelines provide 100+ data source connectors that provide scalable
 
 You will have to fill in your **client_id**, **client_secret**, **tenant_id**, **subscription_id**, **resource_group_name**, **data_factory_name**, and **pipeline_name**.
 
-1. Upload the **adf.py** file to your blob storage within a folder called **DAG**.
-1. [Import the **DAG** folder into your Managed Airflow environment](./how-does-managed-airflow-work.md#import-dags). If you do not have one, [create a new one](./how-does-managed-airflow-work.md#create-a-managed-airflow-environment)
+1. Upload the **adf.py** file to your blob storage within a folder called **DAGS**.
+1. [Import the **DAGS** folder into your Managed Airflow environment](./how-does-managed-airflow-work.md#import-dags). If you do not have one, [create a new one](./how-does-managed-airflow-work.md#create-a-managed-airflow-environment)
 
 :::image type="content" source="media/tutorial-run-existing-pipeline-with-airflow/airflow-environment.png" alt-text="Screenshot showing the data factory management tab with the Airflow section selected.":::