
Commit 482f9ee

Databricks process and screenshot updates
1 parent f23249e commit 482f9ee

7 files changed, +7 -19 lines changed

articles/data-factory/transform-data-using-databricks-notebook.md

Lines changed: 7 additions & 19 deletions
@@ -4,7 +4,7 @@ description: "Learn how you can use the Databricks Notebook Activity in an Azure
 ms.topic: tutorial
 ms.author: abnarain
 author: nabhishek
-ms.date: 10/03/2024
+ms.date: 01/15/2025
 ms.subservice: orchestration
 ---
 
@@ -60,7 +60,7 @@ For an eleven-minute introduction and demonstration of this feature, watch the f
 
 The name of the Azure data factory must be *globally unique*. If you see the following error, change the name of the data factory (For example, use **<yourname>ADFTutorialDataFactory**). For naming rules for Data Factory artifacts, see the [Data Factory - naming rules](./naming-rules.md) article.
 
-:::image type="content" source="./media/doc-common-process/name-not-available-error.png" alt-text="Screenshot showing the Error when a name is not available.":::
+:::image type="content" source="./media/doc-common-process/name-not-available-error.png" alt-text="Screenshot showing the Error when a name isn't available.":::
 
 1. For **Version**, select **V2**.
 
@@ -94,11 +94,11 @@ In this section, you author a Databricks linked service. This linked service con
 
 1. For **Name**, enter ***AzureDatabricks\_LinkedService***.
 
-1. Select the appropriate **Databricks workspace** that you will run your notebook in.
+1. Select the appropriate **Databricks workspace** that you'll run your notebook in.
 
 1. For **Select cluster**, select **New job cluster**.
 
-1. For **Databrick Workspace URL**, the information should be auto-populated.
+1. For **Databricks Workspace URL**, the information should be autopopulated.
 
 1. For **Authentication type**, if you select **Access Token**, generate it from Azure Databricks workplace. You can find the steps [here](https://docs.databricks.com/administration-guide/access-control/tokens.html). For **Managed service identity** and **User Assigned Managed Identity**, grant **Contributor role** to both identities in Azure Databricks resource's *Access control* menu.
 
@@ -142,13 +142,7 @@ In this section, you author a Databricks linked service. This linked service con
 
 1. Create a **New Folder** in Workplace and call it as **adftutorial**.
 
-   :::image type="content" source="media/transform-data-using-databricks-notebook/databricks-notebook-activity-image13.png" alt-text="Screenshot showing how to create a new folder.":::
-
-1. [Screenshot showing how to create a new notebook.](https://docs.databricks.com/user-guide/notebooks/index.html#creating-a-notebook) (Python), let’s call it **mynotebook** under **adftutorial** Folder, click **Create.**
-
-   :::image type="content" source="media/transform-data-using-databricks-notebook/databricks-notebook-activity-image14.png" alt-text="Screenshot showing how to create a new notebook.":::
-
-   :::image type="content" source="media/transform-data-using-databricks-notebook/databricks-notebook-activity-image15.png" alt-text="Screenshot showing how to set the properties of the new notebook.":::
+1. [Create a new notebook](https://docs.databricks.com/notebooks/notebooks-manage.html#create-a-notebook-in-any-folder), let’s call it **mynotebook**. Right-click the **adftutorial** Folder, and select **Create.**
 
 1. In the newly created notebook "mynotebook'" add the following code:
 
@@ -161,8 +155,6 @@ In this section, you author a Databricks linked service. This linked service con
    print (y)
    ```
 
-   :::image type="content" source="media/transform-data-using-databricks-notebook/databricks-notebook-activity-image16.png" alt-text="Screenshot showing how to create widgets for parameters.":::
-
 1. The **Notebook Path** in this case is **/adftutorial/mynotebook**.
 
 1. Switch back to the **Data Factory UI authoring tool**. Navigate to **Settings** Tab under the **Notebook1** activity.
@@ -207,13 +199,9 @@ The **Pipeline run** dialog box asks for the **name** parameter. Use **/path/fil
 
 ## Verify the output
 
-You can log on to the **Azure Databricks workspace**, go to **Clusters** and you can see the **Job** status as *pending execution, running, or terminated*.
-
-:::image type="content" source="media/transform-data-using-databricks-notebook/databricks-notebook-activity-image24.png" alt-text="Screenshot showing how to view the job cluster and the job.":::
-
-You can click on the **Job name** and navigate to see further details. On successful run, you can validate the parameters passed and the output of the Python notebook.
+You can log on to the **Azure Databricks workspace**, go to **Job Runs** and you can see the **Job** status as *pending execution, running, or terminated*.
 
-:::image type="content" source="media/transform-data-using-databricks-notebook/databricks-output.png" alt-text="Screenshot showing how to view the run details and output.":::
+You can select the **Job name** and navigate to see further details. On successful run, you can validate the parameters passed and the output of the Python notebook.
 
 ## Related content

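The notebook cell kept as context in the diff ends with `print (y)`, which prints a parameter passed in by the pipeline's Notebook activity. A minimal runnable sketch of that widget pattern follows; since the global `dbutils` object only exists on a Databricks cluster, a local stand-in class is used here (an assumption for running the sketch outside Databricks), and the widget name `input` is illustrative rather than taken from this commit:

```python
# Sketch of the parameter-widget pattern used by the tutorial's notebook.
# Outside a Databricks cluster there is no global `dbutils`, so this stub
# mimics the two widget calls the notebook relies on; it is a stand-in,
# not the real Databricks API object.
class _WidgetsStub:
    def __init__(self):
        self._values = {}

    def text(self, name, default_value, label=None):
        # Declare a text widget with a default. At run time, the Data
        # Factory Notebook activity supplies the actual value through
        # its base parameters, overriding this default.
        self._values.setdefault(name, default_value)

    def get(self, name):
        return self._values[name]


class _DbUtilsStub:
    widgets = _WidgetsStub()


dbutils = _DbUtilsStub()

# The cell body: declare the widget, read the parameter, and print it,
# matching the diff's `print (y)` context line.
dbutils.widgets.text("input", "", "input")
y = dbutils.widgets.get("input")
print(y)
```

With no pipeline supplying a value, `y` falls back to the widget's default (the empty string); in the tutorial's pipeline run, the value entered in the **Pipeline run** dialog would arrive here instead.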