
Commit 1b49ee2: PR review fixes
1 parent 47947f8 commit 1b49ee2

File tree: 9 files changed, +169 -167 lines changed


articles/data-factory/airflow-pricing.md

Lines changed: 2 additions & 2 deletions
@@ -17,9 +17,9 @@ This article describes the pricing for Managed Airflow usage within data factory

 ## Pricing details

-Managed Airflow supports either small (D2v4) or large (D4v4) node sizing. Small can support up to 50 DAGs simultaneously, and large can support up to 1000 DAGs.The following table describes pricing for each option:
+Managed Airflow supports either small (D2v4) or large (D4v4) node sizing. Small can support up to 50 DAGs simultaneously, and large can support up to 1000 DAGs. The following table describes pricing for each option:

-:::image type="content" source="media/airflow-pricing/airflow-pricing.png" alt-text="Image showing a table of pricing options for Managed Airflow configuration.":::
+:::image type="content" source="media/airflow-pricing/airflow-pricing.png" alt-text="Shows a screenshot of a table of pricing options for Managed Airflow configuration.":::

 ## Next steps


articles/data-factory/concept-managed-airflow.md

Lines changed: 14 additions & 20 deletions
@@ -7,6 +7,7 @@ ms.topic: conceptual
 author: nabhishek
 ms.author: abnarain
 ms.date: 01/20/2023
+ms.custom: references_regions
 ---

 # What is Azure Data Factory Managed Airflow?
@@ -27,31 +28,24 @@ Managed Airflow in Azure Data Factory is a managed orchestration service for [

 ## When to use Managed Airflow?

-Azure Data Factory offers [Pipelines](concepts-pipelines-activities.md) to visually orchestrate data processes (UI-based authoring). While Managed Airflow, offers Airflow based python DAGs (python code-centric authoring) for defining the data orchestration process. If you have the Airflow background, or are currently using Apace Airflow, you may prefer to use the Managed Airflow instead of the pipelines. On the contrary, if you would not like to write/ manage python-based DAGs for data process orchestration, you may prefer to use pipelines.
+Azure Data Factory offers [Pipelines](concepts-pipelines-activities.md) to visually orchestrate data processes (UI-based authoring), while Managed Airflow offers Airflow-based Python DAGs (Python code-centric authoring) for defining the data orchestration process. If you have an Airflow background, or are currently using Apache Airflow, you may prefer to use Managed Airflow instead of pipelines. Conversely, if you would rather not write and manage Python-based DAGs for data process orchestration, you may prefer to use pipelines.

 With Managed Airflow, Azure Data Factory now offers multi-orchestration capabilities spanning across visual, code-centric, OSS orchestration requirements.

 ## Features

-* **Automatic Airflow setup** – Quickly set up Apache Airflow by choosing an [Apache Airflow version](concept-managed-airflow.md#supported-apache-airflow-versions) when you create a Managed Airflow environment. ADF Managed Airflow sets up Apache Airflow for you using the same Apache Airflow user interface and open-source code you can download on the Internet.
-
-* **Automatic scaling** – Automatically scale Apache Airflow Workers by setting the minimum and maximum number of Workers that run in your environment. ADF Managed Airflow monitors the Workers in your environment. It uses its autoscaling component to add Workers to meet demand until it reaches the maximum number of Workers you defined.
-
-* **Built-in authentication** – Enable Azure Active Directory (Azure AD) role-based authentication and authorization for your Airflow Web server by defining AAD RBAC's access control policies.
-
-* **Built-in security** – Metadata is also automatically encrypted by Azure-managed keys, so your environment is secure by default. Additionally, it supports double encryption with a Customer-Managed Key (CMK).
-
-* **Streamlined upgrades and patches** – Azure Data Factory Managed Airflow provide new versions of Apache Airflow periodically. The ADF Managed Airflow team will auto-update and patch the minor versions.
-
-* **Workflow monitoring** – View Airflow logs and Airflow metrics in Azure Monitor to identify Airflow task delays or workflow errors without needing additional third-party tools. Managed Airflow automatically sends environment metrics, and if enabled, Airflow logs to Azure Monitor.
-
-* **Azure integration** – Azure Data Factory Managed Airflow supports open-source integrations with Azure Data Factory pipelines, Azure Batch, Azure CosmosDB, Azure Key Vault, ACI, ADLS Gen2, Azure Kusto, as well as hundreds of built-in and community-created operators and sensors.
-
-## Architecture (Image to be updated)
+- **Automatic Airflow setup** – Quickly set up Apache Airflow by choosing an [Apache Airflow version](concept-managed-airflow.md#supported-apache-airflow-versions) when you create a Managed Airflow environment. ADF Managed Airflow sets up Apache Airflow for you using the same Apache Airflow user interface and open-source code you can download on the Internet.
+- **Automatic scaling** – Automatically scale Apache Airflow Workers by setting the minimum and maximum number of Workers that run in your environment. ADF Managed Airflow monitors the Workers in your environment. It uses its autoscaling component to add Workers to meet demand until it reaches the maximum number of Workers you defined.
+- **Built-in authentication** – Enable Azure Active Directory (Azure AD) role-based authentication and authorization for your Airflow Web server by defining Azure AD RBAC access control policies.
+- **Built-in security** – Metadata is also automatically encrypted by Azure-managed keys, so your environment is secure by default. Additionally, it supports double encryption with a Customer-Managed Key (CMK).
+- **Streamlined upgrades and patches** – Azure Data Factory Managed Airflow provides new versions of Apache Airflow periodically. The ADF Managed Airflow team auto-updates and patches the minor versions.
+- **Workflow monitoring** – View Airflow logs and Airflow metrics in Azure Monitor to identify Airflow task delays or workflow errors without needing additional third-party tools. Managed Airflow automatically sends environment metrics, and if enabled, Airflow logs to Azure Monitor.
+- **Azure integration** – Azure Data Factory Managed Airflow supports open-source integrations with Azure Data Factory pipelines, Azure Batch, Azure Cosmos DB, Azure Key Vault, ACI, ADLS Gen2, Azure Kusto, as well as hundreds of built-in and community-created operators and sensors.

+## Architecture
 :::image type="content" source="media/concept-managed-airflow/architecture.png" alt-text="Screenshot shows architecture in Managed Airflow.":::

-## Region availability (Public preview)
+## Region availability (public preview)

 * EastUs
 * SouthCentralUs
@@ -79,11 +73,11 @@ Apache Airflow integrates with Microsoft Azure services through microsoft.azure

 You can install any provider package by editing the airflow environment from the Azure Data Factory UI. It takes around a couple of minutes to install the package.

-:::image type="content" source="media/concept-managed-airflow/airflow-integration.png" alt-text="Screenshot shows airflow integration.":::
+:::image type="content" source="media/concept-managed-airflow/airflow-integration.png" lightbox="media/concept-managed-airflow/airflow-integration.png" alt-text="Screenshot shows airflow integration.":::

-## Next Steps
+## Next steps

 - [Run an existing pipeline with Managed Airflow](tutorial-run-existing-pipeline-with-airflow.md)
 - [Refresh a Power BI dataset with Managed Airflow](tutorial-refresh-power-bi-dataset-with-airflow.md)
 - [Managed Airflow pricing](airflow-pricing.md)
-- [How to change the password for Managed Airflow environments](password-change-airflow.md)
+- [How to change the password for Managed Airflow environments](password-change-airflow.md)
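
The last hunk above notes that Apache Airflow integrates with Azure services through microsoft.azure provider packages, which can be installed by editing the Airflow environment in the ADF UI. For illustration only, a minimal sketch of a DAG that uses that provider, assuming the apache-airflow-providers-microsoft-azure package has been added to the environment's requirements and an Airflow connection named azure_data_factory_default has been configured; the pipeline, resource group, and factory names below are placeholders, not values from the article:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.microsoft.azure.operators.data_factory import (
    AzureDataFactoryRunPipelineOperator,
)

# Illustrative sketch: the connection id, pipeline, resource group, and
# factory names are placeholders, not values taken from the article.
with DAG(
    dag_id="run_adf_pipeline_example",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_pipeline = AzureDataFactoryRunPipelineOperator(
        task_id="run_pipeline",
        azure_data_factory_conn_id="azure_data_factory_default",
        pipeline_name="my_adf_pipeline",
        resource_group_name="my-resource-group",
        factory_name="my-data-factory",
        wait_for_termination=True,
    )
```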

articles/data-factory/how-does-managed-airflow-work.md

Lines changed: 50 additions & 44 deletions
@@ -19,68 +19,75 @@ ms.date: 01/20/2023
 Azure Data Factory Managed Airflow orchestrates your workflows using Directed Acyclic Graphs (DAGs) written in Python. You must provide your DAGs and plugins in Azure Blob Storage. Airflow requirements or library dependencies can be installed during the creation of the new Managed Airflow environment or by editing an existing Managed Airflow environment. Then run and monitor your DAGs by launching the Airflow UI from ADF using a command line interface (CLI) or a software development kit (SDK).

 ## Create a Managed Airflow environment
+The following steps set up and configure your Managed Airflow environment.

-* **Prerequisite**
-  * **Azure subscription**: If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/) before you begin.
-  * Create or select an existing Data Factory in the region where the managed airflow preview is supported. Supported regions
+### Prerequisites
+**Azure subscription**: If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/) before you begin.
+Create or select an existing Data Factory in the region where the managed airflow preview is supported.

-* Create new Managed Airflow environment.
-  Go to ‘Manage’ hub -> ‘Airflow (Preview)’ -> ‘+New’ to create a new Airflow environment
+### Steps to create the environment
+1. Create a new Managed Airflow environment.
+   Go to the **Manage** hub -> **Airflow (Preview)** -> **+New** to create a new Airflow environment.

-   :::image type="content" source="media/how-does-managed-airflow-work/create-new-airflow.png" alt-text="Screenshot that shows that how to create a new Managed Apache Airflow environment.":::
+   :::image type="content" source="media/how-does-managed-airflow-work/create-new-airflow.png" alt-text="Screenshot that shows how to create a new Managed Apache Airflow environment.":::

-* Provide the details (Airflow config.)
+1. Provide the details (Airflow config).

   :::image type="content" source="media/how-does-managed-airflow-work/airflow-environment-details.png" alt-text="Screenshot that shows some Managed Airflow environment details.":::

-  Important:<br>
-  1. When using Basic authentication, remember the username and password specified in this screen. It will be needed to login later in the Managed Airflow UI. The default option is AAD and it does not require creating username/ password for your Airflow environment, but instead uses the logged in users credential to Azure Data Factory to login/ monitor DAGs.<br>
-  2.Environment variables a simple key value store within Airflow to store and retrieve arbitrary content or settings.<br>
-  3.Requirements can be used to pre-install python libraries. You can update these later as well.
+   > [!IMPORTANT]
+   > When using **Basic** authentication, remember the username and password specified in this screen. It will be needed to log in later in the Managed Airflow UI. The default option is **AAD**, which does not require creating a username/password for your Airflow environment; instead, it uses the logged-in user's Azure Data Factory credential to log in and monitor DAGs.
+1. **Environment variables**: a simple key-value store within Airflow to store and retrieve arbitrary content or settings.
+1. **Requirements** can be used to pre-install Python libraries. You can update these later as well.

 ## Import DAGs

-* Prerequisite
+The following steps describe how to import DAGs into Managed Airflow.
+
+### Prerequisite
+
+You will need to upload a sample DAG onto an accessible Storage account.

-* You will need to upload a sample DAG onto an accessible Storage account.
 > [!NOTE]
 > Blob Storage behind VNet are not supported during the preview. We will be adding the support shortly.

-[Sample Apache Airflow v2.x DAG](https://airflow.apache.org/docs/apache-airflow/stable/tutorial/fundamentals.html).<br>
-[Sample Apache Airflow v1.10 DAG](https://airflow.apache.org/docs/apache-airflow/1.10.11/_modules/airflow/example_dags/tutorial.html).
+[Sample Apache Airflow v2.x DAG](https://airflow.apache.org/docs/apache-airflow/stable/tutorial/fundamentals.html).
+[Sample Apache Airflow v1.10 DAG](https://airflow.apache.org/docs/apache-airflow/1.10.11/_modules/airflow/example_dags/tutorial.html).

-Copy-paste the content (either v2.x or v1.10 based on the Airflow environment that you have setup) into a new file called as tutorial.py’.<br>
+1. Copy-paste the content (either v2.x or v1.10, based on the Airflow environment that you have set up) into a new file called **tutorial.py**.

-Upload the ‘tutorial.py’ to a blob storage. ([How to upload a file into blob](/storage/blobs/storage-quickstart-blobs-portal.md))
-> [!NOTE]
->You will need to select a directory path from a blob storage account that contains folders named 'dags' and 'plugins' to import those into the Airflow environment. ‘Plugins’ are not mandatory. You can also have a container named ‘dags’ and upload all Airflow files within it.
+   Upload **tutorial.py** to a blob storage. ([How to upload a file into blob](/storage/blobs/storage-quickstart-blobs-portal.md))

-* Click on ‘Airflow (Preview)’ under ‘Manage’ hub. Then hover over the earlier created ‘Airflow’ environment and click on ‘Import files’ to Import all DAGs and dependencies into the Airflow Environment.
+   > [!NOTE]
+   > You will need to select a directory path from a blob storage account that contains folders named **dags** and **plugins** to import those into the Airflow environment. **Plugins** are not mandatory. You can also have a container named **dags** and upload all Airflow files within it.

-   :::image type="content" source="media/how-does-managed-airflow-work/import-files.png" alt-text="Screenshot shows import files in manage hub.":::
+1. Click on **Airflow (Preview)** under the **Manage** hub. Then hover over the earlier created **Airflow** environment and click on **Import files** to import all DAGs and dependencies into the Airflow environment.

-* Create a new Linked Service to the accessible storage account mentioned in the prerequisite (or use an existing one if you already have your own DAGs).
+   :::image type="content" source="media/how-does-managed-airflow-work/import-files.png" alt-text="Screenshot shows import files in manage hub.":::

-   :::image type="content" source="media/how-does-managed-airflow-work/create-new-linkservice.png" alt-text="Screenshot shows that how to create a new linked service.":::
+1. Create a new Linked Service to the accessible storage account mentioned in the prerequisite (or use an existing one if you already have your own DAGs).

-* Use the storage account where you uploaded the DAG (check prerequisite). Test connection, then click ‘Create’.
+   :::image type="content" source="media/how-does-managed-airflow-work/create-new-linked-service.png" alt-text="Screenshot that shows how to create a new linked service.":::

-   :::image type="content" source="media/how-does-managed-airflow-work/linkservice-details.png" alt-text="Screenshot shows some linked service details.":::
+1. Use the storage account where you uploaded the DAG (check prerequisite). Test the connection, then click **Create**.

-* Browse and select ‘airflow’ if using the sample SAS URL or select the folder that contains ‘dags’ folder with DAG files.
-> [!NOTE]
-> You can import DAGs and their dependencies through this interface. You will need to select a directory path from a blob storage account that contains folders named 'dags' and 'plugins' to import those into the Airflow environment. ‘Plugins’ are not mandatory.
+   :::image type="content" source="media/how-does-managed-airflow-work/linked-service-details.png" alt-text="Screenshot shows some linked service details.":::

-   :::image type="content" source="media/how-does-managed-airflow-work/browse-storage.png" alt-text="Screenshot shows browse storage in import files.":::
+1. Browse and select **airflow** if using the sample SAS URL, or select the folder that contains the **dags** folder with DAG files.

-   :::image type="content" source="media/how-does-managed-airflow-work/browse.png" alt-text="Screenshot shows browse in airflow":::
+   > [!NOTE]
+   > You can import DAGs and their dependencies through this interface. You will need to select a directory path from a blob storage account that contains folders named **dags** and **plugins** to import those into the Airflow environment. **Plugins** are not mandatory.

-   :::image type="content" source="media/how-does-managed-airflow-work/import-in-import-files.png" alt-text="Screenshot shows import in import files.":::
+   :::image type="content" source="media/how-does-managed-airflow-work/browse-storage.png" alt-text="Screenshot shows browse storage in import files.":::

-   :::image type="content" source="media/how-does-managed-airflow-work/import-dags.png" alt-text="Screenshot shows import dags.":::
+   :::image type="content" source="media/how-does-managed-airflow-work/browse.png" alt-text="Screenshot that shows browse in airflow.":::
+
+   :::image type="content" source="media/how-does-managed-airflow-work/import-in-import-files.png" alt-text="Screenshot shows import in import files.":::
+
+   :::image type="content" source="media/how-does-managed-airflow-work/import-dags.png" alt-text="Screenshot shows import dags.":::

 > [!NOTE]
-> Importing DAGs could take a couple of minutes during Preview. The notification center (bell icon in ADF UI) can be used to track the import status updates.
+> Importing DAGs could take a couple of minutes during **Preview**. The notification center (bell icon in ADF UI) can be used to track the import status updates.

 ## Troubleshooting import DAG issues

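
The import steps above have you copy a linked Apache Airflow sample into a file named **tutorial.py**. For reference, a minimal Airflow 2.x DAG along the lines of that tutorial could look like the following sketch; the dag_id, task ids, and commands are illustrative, not the exact content of the upstream sample:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

# Minimal tutorial-style DAG sketch, saved as tutorial.py and uploaded to the
# "dags" folder in Blob Storage. Task ids and commands are illustrative.
with DAG(
    dag_id="tutorial",
    start_date=datetime(2023, 1, 1),
    schedule_interval=timedelta(days=1),
    catchup=False,
    default_args={"retries": 1},
) as dag:
    print_date = BashOperator(task_id="print_date", bash_command="date")
    sleep = BashOperator(task_id="sleep", bash_command="sleep 5", retries=3)

    # print_date runs first, then sleep.
    print_date >> sleep
```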

@@ -97,28 +104,27 @@ Mitigation: Login into the Airflow UI and see if there are any DAG parsing error

 To monitor the Airflow DAGs, login into Airflow UI with the earlier created username and password.

-* Click on the Airflow environment created.
+1. Click on the Airflow environment created.

-   :::image type="content" source="media/how-does-managed-airflow-work/airflow-environment-monitor-dag.png" alt-text="Screenshot shows that click on the Airflow environment created":::
+   :::image type="content" source="media/how-does-managed-airflow-work/airflow-environment-monitor-dag.png" alt-text="Screenshot that shows the Airflow environment created.":::

-* Login using the username-password provided during the Airflow Integration Runtime creation. ([You can reset the username or password by editing the Airflow Integration runtime]() if needed)
+1. Log in using the username and password provided during the Airflow Integration Runtime creation. ([You can reset the username or password by editing the Airflow Integration runtime]() if needed.)

-   :::image type="content" source="media/how-does-managed-airflow-work/login-in-dags.png" alt-text="Screenshot shows that login using the username-password provided during the Airflow Integration Runtime creation.":::
+   :::image type="content" source="media/how-does-managed-airflow-work/login-in-dags.png" alt-text="Screenshot that shows login using the username-password provided during the Airflow Integration Runtime creation.":::

 ## Remove DAGs from the Airflow environment

-* If you are using Airflow version 2.x,
-
 If you are using Airflow version 1.x, delete DAGs that are deployed on any Airflow environment (IR), you need to delete the DAGs in two different places.
+
 1. Delete the DAG from Airflow UI
 1. Delete the DAG in ADF UI

 > [!NOTE]
 > This is the current experience during the Public Preview, and we will be improving this experience.

-## Next Steps
+## Next steps

-- [Run an existing pipeline with Managed Airflow](tutorial-run-existing-pipeline-with-airflow.md)
-- [Refresh a Power BI dataset with Managed Airflow](tutorial-refresh-power-bi-dataset-with-airflow.md)
-- [Managed Airflow pricing](airflow-pricing.md)
-- [How to change the password for Managed Airflow environments](password-change-airflow.md)
+* [Run an existing pipeline with Managed Airflow](tutorial-run-existing-pipeline-with-airflow.md)
+* [Refresh a Power BI dataset with Managed Airflow](tutorial-refresh-power-bi-dataset-with-airflow.md)
+* [Managed Airflow pricing](airflow-pricing.md)
+* [How to change the password for Managed Airflow environments](password-change-airflow.md)
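
The import workflow earlier in this file's diff requires **tutorial.py** to be uploaded to Blob Storage before it can be imported. If that upload is scripted rather than done through the portal, a sketch using the azure-storage-blob Python SDK could look like this; the connection string, container name, and blob path are placeholders, not values from the article:

```python
from azure.storage.blob import BlobServiceClient

# Illustrative sketch: connection string, container name, and blob path are
# placeholders; the article uploads the file through the Azure portal instead.
CONNECTION_STRING = "<storage-account-connection-string>"

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
blob = service.get_blob_client(container="airflow", blob="dags/tutorial.py")

with open("tutorial.py", "rb") as data:
    blob.upload_blob(data, overwrite=True)

print(f"Uploaded tutorial.py to {blob.url}")
```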
