Commit a9de33e

Merge pull request #102571 from TimShererWithAquent/us1668382a

1668382 Quickstart consistency.

2 parents b8f4ebb + 3dc3d9c

1 file changed: +14 / -7 lines


articles/data-factory/quickstart-create-data-factory-python.md (14 additions, 7 deletions)
```diff
@@ -15,24 +15,31 @@ ms.date: 01/22/2018
 ms.custom: seo-python-october2019
 ---
 
-# Quickstart: Create an Azure Data Factory and pipeline using Python
+# Quickstart: Create a data factory and pipeline using Python
 
 > [!div class="op_single_selector" title1="Select the version of Data Factory service you are using:"]
 > * [Version 1](v1/data-factory-copy-data-from-azure-blob-storage-to-sql-database.md)
 > * [Current version](quickstart-create-data-factory-python.md)
 
-This quickstart describes how to use Python to create an Azure data factory. The pipeline in this data factory copies data from one folder to another folder in an Azure blob storage.
+In this quickstart, you create a data factory by using Python. The pipeline in this data factory copies data from one folder to another folder in Azure Blob storage.
 
-Azure Data Factory is a cloud-based data integration service that allows you to create data-driven workflows in the cloud for orchestrating and automating data movement and data transformation. Using Azure Data Factory, you can create and schedule data-driven workflows (called pipelines) that can ingest data from disparate data stores, process/transform the data by using compute services such as Azure HDInsight Hadoop, Spark, Azure Data Lake Analytics, and Azure Machine Learning, and publish output data to data stores such as Azure SQL Data Warehouse for business intelligence (BI) applications to consume.
+Azure Data Factory is a cloud-based data integration service that allows you to create data-driven workflows for orchestrating and automating data movement and data transformation. Using Azure Data Factory, you can create and schedule data-driven workflows, called pipelines.
 
-If you don't have an Azure subscription, create a [free](https://azure.microsoft.com/free/) account before you begin.
+Pipelines can ingest data from disparate data stores. Pipelines process or transform data by using compute services such as Azure HDInsight Hadoop, Spark, Azure Data Lake Analytics, and Azure Machine Learning. Pipelines publish output data to data stores such as Azure SQL Data Warehouse for business intelligence (BI) applications.
 
 ## Prerequisites
 
-* **Azure Storage account**. You use the blob storage as **source** and **sink** data store. If you don't have an Azure storage account, see the [Create a storage account](../storage/common/storage-account-create.md) article for steps to create one.
-* **Create an application in Azure Active Directory** following [this instruction](../active-directory/develop/howto-create-service-principal-portal.md#create-an-azure-active-directory-application). Make note of the following values that you use in later steps: **application ID**, **authentication key**, and **tenant ID**. Assign application to "**Contributor**" role by following instructions in the same article.
+* An Azure account with an active subscription. [Create one for free](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio).
 
-### Create and upload an input file
+* [Python 3.4+](https://www.python.org/downloads/).
+
+* [An Azure Storage account](../storage/common/storage-account-create.md).
+
+* [Azure Storage Explorer](https://storageexplorer.com/) (optional).
+
+* [An application in Azure Active Directory](../active-directory/develop/howto-create-service-principal-portal.md#create-an-azure-active-directory-application). Make note of the following values to use in later steps: **application ID**, **authentication key**, and **tenant ID**. Assign application to the **Contributor** role by following instructions in the same article.
+
+## Create and upload an input file
 
 1. Launch Notepad. Copy the following text and save it as **input.txt** file on your disk.
```
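The revised prerequisites still ask you to record three service-principal values: **application ID**, **authentication key**, and **tenant ID**. A minimal sketch of how those values are typically collected before any of the quickstart's Azure SDK calls; the environment-variable names and dictionary keys here are illustrative assumptions, not part of the article:

```python
# Collect the three service-principal values the prerequisites ask you to
# note down. The environment-variable names are illustrative assumptions.
import os

sp_config = {
    "tenant_id": os.environ.get("AZURE_TENANT_ID", "<tenant ID>"),
    "client_id": os.environ.get("AZURE_CLIENT_ID", "<application ID>"),
    "client_secret": os.environ.get("AZURE_CLIENT_SECRET", "<authentication key>"),
}

# List any value still left as a placeholder, so a missing credential is
# caught here rather than later inside an Azure SDK call.
missing = [name for name, value in sp_config.items() if value.startswith("<")]
print("placeholder values still unset:", missing)
```

Reading the secrets from the environment rather than hard-coding them keeps the authentication key out of source control.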

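The step shown as context at the end of the hunk creates **input.txt** by hand in Notepad. The same step can be scripted; since the file's actual text is truncated out of this diff, the contents below are placeholder data, not the article's sample:

```python
# Create the sample input file that the quickstart's pipeline will copy.
# The real file contents are not shown in this diff; the two lines below
# are placeholder data only.
from pathlib import Path

placeholder_text = "John|Doe\nJane|Doe\n"

input_path = Path("input.txt")
input_path.write_text(placeholder_text)
print(f"Wrote {input_path} ({input_path.stat().st_size} bytes)")
```

The resulting file is what you would then upload to the Blob storage container that serves as the pipeline's source folder.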