
Commit fda9bbc

Update solution-template-databricks-notebook.md
1 parent 724bbb7 commit fda9bbc


articles/data-factory/solution-template-databricks-notebook.md

Lines changed: 8 additions & 8 deletions
@@ -29,11 +29,11 @@ For simplicity, the template in this tutorial doesn't create a scheduled trigger
 
 ## Prerequisites
 
-- An **Azure Blob storage account** with a container called `sinkdata` for use as **sink**
+- An Azure Blob storage account with a container called `sinkdata` for use as a sink.
 
-  Make note of the **storage account name**, **container name**, and **access key**. You'll need these values later in the template.
+  Make note of the storage account name, container name, and access key. You'll need these values later in the template.
 
-- An **Azure Databricks workspace**
+- An Azure Databricks workspace.
 
 ## Import a notebook for Transformation
 
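The prerequisites in the hunk above call for a Blob storage container named `sinkdata` plus the account name and access key. A minimal sketch (assuming the `azure-storage-blob` package; `<account-name>` and `<access-key>` are placeholders for your own values) of creating that container ahead of time:

```python
# Minimal sketch: create the `sinkdata` sink container from the prerequisites.
# Assumes the azure-storage-blob package; <account-name> and <access-key> are
# placeholders for the storage account values you noted.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://<account-name>.blob.core.windows.net",
    credential="<access-key>",
)
service.create_container("sinkdata")  # the container the template uses as the sink
```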
@@ -92,21 +92,21 @@ To import a **Transformation** notebook to your Databricks workspace:
 
 ![Connections setting](media/solution-template-Databricks-notebook/connections-preview.png)
 
-- **Source Blob Connection** to access the source data.
+- **Source Blob Connection** - to access the source data.
 
   For this exercise, you can use the public blob storage that contains the source files. Reference the following screenshot for the configuration. Use the following **SAS URL** to connect to source storage (read-only access):
 
   `https://storagewithdata.blob.core.windows.net/data?sv=2018-03-28&si=read%20and%20list&sr=c&sig=PuyyS6%2FKdB2JxcZN0kPlmHSBlD8uIKyzhBWmWzznkBw%3D`
 
   ![Selections for authentication method and SAS URL](media/solution-template-Databricks-notebook/source-blob-connection.png)
 
-- **Destination Blob Connection** to store the copied data.
+- **Destination Blob Connection** - to store the copied data.
 
   In the **New linked service** window, select your sink storage blob.
 
   ![Sink storage blob as a new linked service](media/solution-template-Databricks-notebook/destination-blob-connection.png)
 
-- **Azure Databricks** to connect to the Databricks cluster.
+- **Azure Databricks** - to connect to the Databricks cluster.
 
   Create a Databricks-linked service by using the access key that you generated previously. You can opt to select an *interactive cluster* if you have one. This example uses the **New job cluster** option.
 
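As a quick check that the read-only SAS URL quoted in the hunk above works, here is a minimal sketch (assuming the `azure-storage-blob` package) that lists the source files it exposes:

```python
# Minimal sketch: verify the tutorial's read-and-list SAS URL by listing
# the blobs it exposes. Assumes the azure-storage-blob package.
from azure.storage.blob import ContainerClient

sas_url = (
    "https://storagewithdata.blob.core.windows.net/data"
    "?sv=2018-03-28&si=read%20and%20list&sr=c"
    "&sig=PuyyS6%2FKdB2JxcZN0kPlmHSBlD8uIKyzhBWmWzznkBw%3D"
)
container = ContainerClient.from_container_url(sas_url)
for blob in container.list_blobs():
    print(blob.name)  # source files the Copy data activity will read
```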
@@ -124,7 +124,7 @@ In the new pipeline, most settings are configured automatically with default val
 
 ![Source dataset value](media/solution-template-Databricks-notebook/validation-settings.png)
 
-1. In the **Copy data** activity **file-to-blob**, check the source and sink tabs. Change settings if necessary.
+1. In the **Copy data** activity **file-to-blob**, check the **Source** and **Sink** tabs. Change settings if necessary.
 
    - **Source** tab
    ![Source tab](media/solution-template-Databricks-notebook/copy-source-settings.png)
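Besides eyeballing the **Source** and **Sink** tabs in the UI, the same activity settings can be inspected programmatically. A hedged sketch (assuming the `azure-identity` and `azure-mgmt-datafactory` packages; the subscription, resource group, factory, and pipeline names below are all placeholders):

```python
# Hedged sketch: fetch the pipeline and print each Copy activity's source
# and sink settings, mirroring the manual Source/Sink tab check. Assumes
# azure-identity and azure-mgmt-datafactory; all names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
pipeline = client.pipelines.get("<resource-group>", "<factory-name>", "<pipeline-name>")
for activity in pipeline.activities:
    if getattr(activity, "source", None) is not None:  # Copy data activities
        print(activity.name, activity.source, activity.sink)
```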
@@ -160,7 +160,7 @@ In the new pipeline, most settings are configured automatically with default val
 
 ![Selections for linked service and file path for SourceFilesDataset](media/solution-template-Databricks-notebook/source-file-dataset.png)
 
-- **DestinationFilesDataset** to copy the data into the sink destination location. Use the following values:
+- **DestinationFilesDataset** - to copy the data into the sink destination location. Use the following values:
 
   - **Linked service** - `sinkBlob_LS`, created in a previous step.
 
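For reference, the dataset this hunk describes is JSON under the hood. A hypothetical sketch of what a **DestinationFilesDataset** definition might look like, expressed as a Python dict; only the `sinkBlob_LS` reference comes from the tutorial, and the folder path is an illustrative assumption:

```python
# Hypothetical sketch of an Azure Blob dataset body such as
# DestinationFilesDataset. Only the sinkBlob_LS reference comes from the
# tutorial; the folder path is an illustrative assumption.
destination_files_dataset = {
    "name": "DestinationFilesDataset",
    "properties": {
        "type": "AzureBlob",
        "linkedServiceName": {
            "referenceName": "sinkBlob_LS",  # created in a previous step
            "type": "LinkedServiceReference",
        },
        "typeProperties": {
            "folderPath": "sinkdata/<directory>",  # placeholder sink folder
        },
    },
}
```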