
Commit 2aa6bee: Acrolinx improvements

1 parent 53d3a8f

2 files changed: +8 -8 lines changed

articles/data-factory/concept-managed-airflow.md

Lines changed: 4 additions & 4 deletions
@@ -23,7 +23,7 @@ ms.custom: references_regions
  Azure Data Factory offers serverless pipelines for data process orchestration, data movement with 100+ managed connectors, and visual transformations with the mapping data flow.

  Azure Data Factory's Managed Airflow service is a simple and efficient way to create and manage [Apache Airflow](https://airflow.apache.org) environments, enabling you to run data pipelines at scale with ease.
- [Apache Airflow](https://airflow.apache.org) is an open-source platform used to programmatically create, schedule, and monitor complex data workflows. It allows you to define a set of tasks, called operators, that can be combined into directed acyclic graphs (DAGs) to represent data pipelines. Airflow enables you to execute these DAGs on a schedule or in response to an event, monitor the progress of workflows, and provide visibility into the state of each task. It is widely used in data engineering and data science to orchestrate data pipelines, and is known for its flexibility, extensibility, and ease of use.
+ [Apache Airflow](https://airflow.apache.org) is an open-source platform used to programmatically create, schedule, and monitor complex data workflows. It allows you to define a set of tasks, called operators, that can be combined into directed acyclic graphs (DAGs) to represent data pipelines. Airflow enables you to execute these DAGs on a schedule or in response to an event, monitor the progress of workflows, and provide visibility into the state of each task. It's widely used in data engineering and data science to orchestrate data pipelines, and is known for its flexibility, extensibility, and ease of use.

  :::image type="content" source="media/concept-managed-airflow/data-integration.png" alt-text="Screenshot shows data integration.":::
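The paragraph revised in this hunk describes Airflow's core model: tasks defined as operators and combined into DAGs that run on a schedule. As a minimal sketch of that model (the DAG name, task IDs, and commands below are hypothetical, not part of this commit):

```python
# A minimal, hypothetical Airflow DAG illustrating operators combined
# into a directed acyclic graph and run on a schedule.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_minimal_dag",      # hypothetical name
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",        # run once per day
    catchup=False,                     # don't backfill past runs
) as dag:
    extract = BashOperator(task_id="extract", bash_command="echo extracting")
    transform = BashOperator(task_id="transform", bash_command="echo transforming")
    load = BashOperator(task_id="load", bash_command="echo loading")

    # ">>" declares the DAG's edges: extract runs before transform, which runs before load.
    extract >> transform >> load
```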

@@ -80,11 +80,11 @@ You can install any provider package by editing the airflow environment from the

  ## Limitations

- * Managed Airflow in other regions will be available by GA (Tentative GA is Q2 2023).
+ * Managed Airflow in other regions will be available by GA.
  * Data sources connecting through Airflow should be publicly accessible.
- * Blob Storage behind VNet are not supported during the public preview (Tentative GA is Q2 2023).
+ * Blob Storage behind VNet isn't supported during the public preview.
  * DAGs that are inside a Blob Storage in a VNet/behind a firewall aren't currently supported.
- * Azure Key Vault is not supported in LinkedServices to import dags. (Tentative GA is Q2 2023)
+ * Azure Key Vault isn't supported in LinkedServices to import DAGs.
  * Airflow officially supports Blob Storage and ADLS, with some limitations.

  ## Next steps

articles/data-factory/tutorial-run-existing-pipeline-with-airflow.md

Lines changed: 4 additions & 4 deletions
@@ -23,8 +23,8 @@ Data Factory pipelines provide 100+ data source connectors that provide scalable

  * **Azure subscription**. If you don't have an Azure subscription, create a [free Azure account](https://azure.microsoft.com/free/) before you begin.
  * **Azure storage account**. If you don't have a storage account, see [Create an Azure storage account](../storage/common/storage-account-create.md?tabs=azure-portal) for steps to create one. *Ensure the storage account allows access only from selected networks.*
- * **Azure Data Factory pipeline**. You can follow any of the tutorials and create a new data factory pipeline in case you do not already have one, or create one with one click in [Get started and try out your first data factory pipeline](quickstart-get-started.md).
- * **Setup a Service Principal**. You will need to [create a new service principal](../active-directory/develop/howto-create-service-principal-portal.md) or use an existing one and grant it permission to run the pipeline (example – contributor role in the data factory where the existing pipelines exist), even if the Managed Airflow environment and the pipelines exist in the same data factory. You will need to get the Service Principal's Client ID and Client Secret (API Key).
+ * **Azure Data Factory pipeline**. You can follow any of the tutorials and create a new data factory pipeline if you don't already have one, or create one with one selection in [Get started and try out your first data factory pipeline](quickstart-get-started.md).
+ * **Set up a Service Principal**. You'll need to [create a new service principal](../active-directory/develop/howto-create-service-principal-portal.md) or use an existing one and grant it permission to run the pipeline (for example, the contributor role in the data factory where the existing pipelines exist), even if the Managed Airflow environment and the pipelines exist in the same data factory. You'll need to get the Service Principal's Client ID and Client Secret (API Key).

  ## Steps
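The Service Principal prerequisite above calls for a Client ID and Client Secret with permission to run the pipeline. A hedged sketch of verifying those credentials outside Airflow before wiring them into a connection, assuming the azure-identity and azure-mgmt-datafactory packages are installed; every `<placeholder>` is a value from your own subscription:

```python
# Hypothetical verification script: confirm the service principal can
# trigger the existing pipeline. Assumes
# `pip install azure-identity azure-mgmt-datafactory`.
from azure.identity import ClientSecretCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

credential = ClientSecretCredential(
    tenant_id="<tenant-id>",
    client_id="<service-principal-client-id>",
    client_secret="<service-principal-client-secret>",  # the "API Key" above
)
adf = DataFactoryManagementClient(credential, "<subscription-id>")

# A service principal with the contributor role on the data factory
# can start a pipeline run directly through the management API.
run = adf.pipelines.create_run(
    resource_group_name="<resource-group>",
    factory_name="<data-factory-name>",
    pipeline_name="<pipeline-name>",
)
print("Triggered pipeline run:", run.run_id)
```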

@@ -88,10 +88,10 @@ Data Factory pipelines provide 100+ data source connectors that provide scalable
      # run_pipeline2 >> pipeline_run_sensor
      ```

-    You will have to create the connection using the Airflow UI (Admin -> Connections -> '+' -> Choose 'Connection type' as 'Azure Data Factory', then fill in your **client_id**, **client_secret**, **tenant_id**, **subscription_id**, **resource_group_name**, **data_factory_name**, and **pipeline_name**.
+    You'll have to create the connection using the Airflow UI (Admin -> Connections -> '+' -> Choose 'Connection type' as 'Azure Data Factory'), then fill in your **client_id**, **client_secret**, **tenant_id**, **subscription_id**, **resource_group_name**, **data_factory_name**, and **pipeline_name**.

  1. Upload the **adf.py** file to your blob storage within a folder called **DAGS**.
- 1. [Import the **DAGS** folder into your Managed Airflow environment](./how-does-managed-airflow-work.md#import-dags). If you do not have one, [create a new one](./how-does-managed-airflow-work.md#create-a-managed-airflow-environment)
+ 1. [Import the **DAGS** folder into your Managed Airflow environment](./how-does-managed-airflow-work.md#import-dags). If you don't have one, [create a new one](./how-does-managed-airflow-work.md#create-a-managed-airflow-environment).

      :::image type="content" source="media/tutorial-run-existing-pipeline-with-airflow/airflow-environment.png" alt-text="Screenshot showing the data factory management tab with the Airflow section selected.":::
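The hunk above shows only the tail of **adf.py** (`run_pipeline2 >> pipeline_run_sensor`). As a rough sketch of what the full DAG can look like, assuming the apache-airflow-providers-microsoft-azure provider package is installed and an Airflow connection named `azure_data_factory` was created as described above (the factory, resource group, and pipeline names are placeholders):

```python
# Hedged sketch of an adf.py DAG: trigger an existing Data Factory
# pipeline, then poll its run status. Assumes the
# apache-airflow-providers-microsoft-azure package and an Airflow
# connection named "azure_data_factory" (see the step above).
from datetime import datetime, timedelta

from airflow.models import DAG, XComArg
from airflow.providers.microsoft.azure.operators.data_factory import (
    AzureDataFactoryRunPipelineOperator,
)
from airflow.providers.microsoft.azure.sensors.data_factory import (
    AzureDataFactoryPipelineRunStatusSensor,
)

with DAG(
    dag_id="example_adf_run_pipeline",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={
        "retries": 1,
        "retry_delay": timedelta(minutes=3),
        "azure_data_factory_conn_id": "azure_data_factory",  # connection created in the Airflow UI
        "factory_name": "<data-factory-name>",      # placeholder
        "resource_group_name": "<resource-group>",  # placeholder
    },
) as dag:
    # Trigger the existing pipeline without blocking the worker on it.
    run_pipeline2 = AzureDataFactoryRunPipelineOperator(
        task_id="run_pipeline2",
        pipeline_name="<pipeline-name>",  # placeholder
        wait_for_termination=False,
    )

    # Poll the triggered run until it finishes; its run_id arrives via XCom.
    pipeline_run_sensor = AzureDataFactoryPipelineRunStatusSensor(
        task_id="pipeline_run_sensor",
        run_id=XComArg(run_pipeline2, key="run_id"),
    )

    run_pipeline2 >> pipeline_run_sensor
```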
