Commit 0a0ce39
Updates for databricks doc updates
1 parent 058508f

1 file changed: +7 -5 lines changed

articles/data-factory/transform-data-databricks-python.md

Lines changed: 7 additions & 5 deletions
@@ -3,13 +3,15 @@ title: Transform data with Databricks Python
 titleSuffix: Azure Data Factory & Azure Synapse
 description: Learn how to process or transform data by running a Databricks Python activity in an Azure Data Factory or Synapse Analytics pipeline.
 ms.topic: conceptual
-ms.date: 10/03/2024
+ms.date: 01/15/2025
 ms.subservice: orchestration
 author: nabhishek
 ms.author: abnarain
 ms.custom: devx-track-python, synapse
 ---
+
 # Transform data by running a Python activity in Azure Databricks
+
 [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
 
 The Azure Databricks Python Activity in a [pipeline](concepts-pipelines-activities.md) runs a Python file in your Azure Databricks cluster. This article builds on the [data transformation activities](transform-data.md) article, which presents a general overview of data transformation and the supported transformation activities. Azure Databricks is a managed platform for running Apache Spark.
@@ -120,17 +122,17 @@ For more details refer [Databricks documentation](/azure/databricks/dev-tools/ap
 
 ### You can use the Workspace UI:
 
-1. [Use the Databricks workspace UI](/azure/databricks/libraries/#create-a-library)
+1. [Use the Databricks workspace UI](/azure/databricks/libraries/cluster-libraries#install-a-library-on-a-cluster)
 
-2. To obtain the dbfs path of the library added using UI, you can use [Databricks CLI](/azure/databricks/dev-tools/cli/#install-the-cli).
+2. To obtain the dbfs path of the library added using UI, you can use [Databricks CLI](/azure/databricks/dev-tools/cli/fs-commands#list-the-contents-of-a-directory).
 
 Typically the Jar libraries are stored under dbfs:/FileStore/jars while using the UI. You can list all through the CLI: *databricks fs ls dbfs:/FileStore/job-jars*
 
 ### Or you can use the Databricks CLI:
 
-1. Follow [Copy the library using Databricks CLI](/azure/databricks/dev-tools/cli/#copy-a-file-to-dbfs)
+1. Follow [Copy the library using Databricks CLI](/azure/databricks/dev-tools/cli/fs-commands#copy-a-directory-or-a-file)
 
-2. Use Databricks CLI [(installation steps)](/azure/databricks/dev-tools/cli/#install-the-cli)
+2. Use Databricks CLI [(installation steps)](/azure/databricks/dev-tools/cli/commands#compute-commands)
 
 As an example, to copy a JAR to dbfs:
 `dbfs cp SparkPi-assembly-0.1.jar dbfs:/docs/sparkpi.jar`
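The relinked pages in this commit describe the newer unified Databricks CLI, where DBFS operations live under the `databricks fs` command group rather than the legacy standalone `dbfs` tool. As a sketch (the DBFS paths and JAR name are the illustrative examples from the article, and an authenticated CLI profile is assumed), the commands referenced in the diff map to:

```shell
# List libraries uploaded via the workspace UI
# (they are typically stored under dbfs:/FileStore/jars)
databricks fs ls dbfs:/FileStore/job-jars

# Copy a local JAR into DBFS; this is the newer-CLI
# equivalent of the legacy `dbfs cp` form shown in the article
databricks fs cp SparkPi-assembly-0.1.jar dbfs:/docs/sparkpi.jar
```

Both commands require the CLI to be installed and configured against your workspace first, which is what the updated installation-steps link covers.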

0 commit comments
