
Commit 229f42c ("Updated screenshots")
Parent: 8581b31

9 files changed (+7, -8 lines)

articles/data-factory/TOC.yml
Lines changed: 1 addition & 1 deletion

````diff
@@ -623,7 +623,7 @@
       href: solution-template-migration-s3-azure.md
     - name: Move files
       href: solution-template-move-files.md
-    - name: ETL with Azure Databricks
+    - name: Transformation with Azure Databricks
       href: solution-template-databricks-notebook.md
     - name: Troubleshooting guides
       items:
````
7 binary image files changed (updated screenshots: 33.8 KB, 8.96 KB, 4.32 KB, 26.1 KB, 27 KB, 25.2 KB, 23.3 KB)

articles/data-factory/solution-template-databricks-notebook.md
Lines changed: 6 additions & 7 deletions

````diff
@@ -1,5 +1,5 @@
 ---
-title: ETL with Azure Databricks
+title: Transformation with Azure Databricks
 description: Learn how to use a solution template to transform data by using a Databricks notebook in Azure Data Factory.
 services: data-factory
 ms.author: abnarain
@@ -13,7 +13,7 @@ ms.custom: seo-lt-2019
 ms.date: 12/10/2018
 ---
 
-# ETL with Azure Databricks
+# Transformation with Azure Databricks
 
 In this tutorial, you create an end-to-end pipeline containing **Validation**, **Copy**, and **Notebook** activities in Data Factory.
 
@@ -33,7 +33,7 @@ To keep this template simple, the template doesn't create a scheduled trigger. Y
 
 2. Ensure you have an **Azure Databricks workspace** or create a new one.
 
-3. **Import the notebook for ETL**.
+3. **Import the notebook for Transformation**.
    1. In your Azure Databricks, reference following screenshots for importing a **Transformation** notebook to the Databricks workspace. It does not have to be in the same location as below, but remember the path that you choose for later.
 
    ![2](media/solution-template-Databricks-notebook/Databricks-tutorial-image02.png)
@@ -68,15 +68,15 @@ To keep this template simple, the template doesn't create a scheduled trigger. Y
       print e \# Otherwise print the whole stack trace.
    ```
 
-5. Generate a **Databricks access token** for Data Factory to access Databricks. **Save the access token** for later use in creating a Databricks linked service, which looks something like 'dapi32db32cbb4w6eee18b7d87e45exxxxxx'
+5. Generate a **Databricks access token** for Data Factory to access Databricks. **Save the access token** for later use in creating a Databricks linked service, which looks something like 'dapi32db32cbb4w6eee18b7d87e45exxxxxx'.
 
    ![4](media/solution-template-Databricks-notebook/Databricks-tutorial-image04.png)
 
   ![5](media/solution-template-Databricks-notebook/Databricks-tutorial-image05.png)
 
 ## How to use this template
 
-1. Go to **ETL with Azure Databricks** template. Create new linked services for following connections.
+1. Go to **Transformation with Azure Databricks** template. Create new linked services for following connections.
 
   ![Connections setting](media/solution-template-Databricks-notebook/connections-preview.png)
 
@@ -117,8 +117,7 @@ In the new pipeline created, most settings have been configured automatically wi
 
    ![14](media/solution-template-Databricks-notebook/Databricks-tutorial-image14.png)
 
-1. A Notebook activity **ETL** is created, and the linked service created in previous step is selected.
-
+1. A Notebook activity **Transformation** is created, and the linked service created in previous step is selected.
    ![16](media/solution-template-Databricks-notebook/Databricks-tutorial-image16.png)
 
 1. Select **Settings** tab. For *Notebook path*, the template defines a path by default. You may need to browse and select the correct notebook path uploaded in **Prerequisite** 2.
````
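The notebook fragment quoted in this diff (`print e \# Otherwise print the whole stack trace.`) reflects a common pattern in the Transformation notebook: wrap the transformation logic and surface the full stack trace on failure rather than swallowing it. A minimal Python 3 sketch of that pattern, where `run_transformation`, `transform`, and `rows` are hypothetical stand-ins for the notebook's actual Spark logic:

```python
import traceback

def run_transformation(transform, rows):
    """Apply `transform` to each input row.

    Hypothetical stand-in for the notebook's Spark code; only the
    try/except shape mirrors the snippet quoted in the diff.
    """
    try:
        return [transform(row) for row in rows]
    except Exception:
        traceback.print_exc()  # Otherwise print the whole stack trace.
        raise  # Re-raise so the Data Factory Notebook activity reports failure.
```

The quoted notebook is Python 2 (`print e`), which prints only the exception message; `traceback.print_exc()` is the Python 3 way to emit the full trace, and re-raising lets the pipeline's Notebook activity fail visibly instead of succeeding on bad data.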

Comments (0)