@@ -21,15 +21,22 @@ This Demo launches Bronze and Silver pipelines with following activities:

2. Install [Databricks CLI](https://docs.databricks.com/dev-tools/cli/index.html)

- 3. ```commandline
+ 3. Install Python package requirements:
+ ```commandline
+ pip install "PyYAML>=6.0" setuptools databricks-sdk
+ pip install delta-spark==3.0.0 pyspark==3.5.5
+ ```
+
+ 4. Clone dlt-meta:
+ ```commandline
git clone https://github.com/databrickslabs/dlt-meta.git
```

- 4. ```commandline
+ 5. ```commandline
cd dlt-meta
```

- 5. Set python environment variable into terminal
+ 6. Set the Python environment variable in your terminal:
```commandline
dlt_meta_home=$(pwd)
```
@@ -38,7 +45,7 @@ This Demo launches Bronze and Silver pipelines with following activities:
export PYTHONPATH=$dlt_meta_home
```

- 6. ```commandline
+ 7. ```commandline
python demo/launch_dais_demo.py --uc_catalog_name=<<uc catalog name>> --profile=<<DEFAULT>>
```
- uc_catalog_name : Unity catalog name
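
For illustration, a filled-in version of the launch command above; the catalog name `dlt_meta_uc` and the profile name `DEFAULT` are hypothetical example values, so substitute your own Unity Catalog name and CLI profile:
```commandline
python demo/launch_dais_demo.py --uc_catalog_name=dlt_meta_uc --profile=DEFAULT
```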
@@ -53,15 +60,21 @@ This demo will launch auto generated tables(100s) inside single bronze and silve

2. Install [Databricks CLI](https://docs.databricks.com/dev-tools/cli/index.html)

- 3. ```commandline
+ 3. Install Python package requirements:
+ ```commandline
+ pip install "PyYAML>=6.0" setuptools databricks-sdk
+ pip install delta-spark==3.0.0 pyspark==3.5.5
+ ```
+
+ 4. ```commandline
git clone https://github.com/databrickslabs/dlt-meta.git
```

- 4. ```commandline
+ 5. ```commandline
cd dlt-meta
```

- 5. Set python environment variable into terminal
+ 6. Set the Python environment variable in your terminal:
```commandline
dlt_meta_home=$(pwd)
```
@@ -70,7 +83,7 @@ This demo will launch auto generated tables(100s) inside single bronze and silve
export PYTHONPATH=$dlt_meta_home
```

- 6. ```commandline
+ 7. ```commandline
python demo/launch_techsummit_demo.py --uc_catalog_name=<<uc catalog name>> --profile=<<DEFAULT>>
```
- uc_catalog_name : Unity catalog name
@@ -89,15 +102,21 @@ This demo will perform following tasks:

2. Install [Databricks CLI](https://docs.databricks.com/dev-tools/cli/index.html)

- 3. ```commandline
+ 3. Install Python package requirements:
+ ```commandline
+ pip install "PyYAML>=6.0" setuptools databricks-sdk
+ pip install delta-spark==3.0.0 pyspark==3.5.5
+ ```
+
+ 4. ```commandline
git clone https://github.com/databrickslabs/dlt-meta.git
```

- 4. ```commandline
+ 5. ```commandline
cd dlt-meta
```

- 5. Set python environment variable into terminal
+ 6. Set the Python environment variable in your terminal:
```commandline
dlt_meta_home=$(pwd)
```
@@ -106,7 +125,7 @@ This demo will perform following tasks:
export PYTHONPATH=$dlt_meta_home
```

- 6. ```commandline
+ 7. ```commandline
python demo/launch_af_cloudfiles_demo.py --uc_catalog_name=<<uc catalog name>> --source=cloudfiles --profile=<<DEFAULT>>
```
- uc_catalog_name : Unity Catalog name
@@ -122,14 +141,20 @@ This demo will perform following tasks:

2. Install [Databricks CLI](https://docs.databricks.com/dev-tools/cli/index.html)

- 3. ```commandline
+ 3. Install Python package requirements:
+ ```commandline
+ pip install "PyYAML>=6.0" setuptools databricks-sdk
+ pip install delta-spark==3.0.0 pyspark==3.5.5
+ ```
+
+ 4. ```commandline
git clone https://github.com/databrickslabs/dlt-meta.git
```

- 4. ```commandline
+ 5. ```commandline
cd dlt-meta
```
- 5. Set python environment variable into terminal
+ 6. Set the Python environment variable in your terminal:
```commandline
dlt_meta_home=$(pwd)
```
@@ -181,14 +206,20 @@ This demo will perform following tasks:

2. Install [Databricks CLI](https://docs.databricks.com/dev-tools/cli/index.html)

- 3. ```commandline
+ 3. Install Python package requirements:
+ ```commandline
+ pip install "PyYAML>=6.0" setuptools databricks-sdk
+ pip install delta-spark==3.0.0 pyspark==3.5.5
+ ```
+
+ 4. ```commandline
git clone https://github.com/databrickslabs/dlt-meta.git
```

- 4. ```commandline
+ 5. ```commandline
cd dlt-meta
```
- 5. Set python environment variable into terminal
+ 6. Set the Python environment variable in your terminal:
```commandline
dlt_meta_home=$(pwd)
```
@@ -198,15 +229,15 @@ This demo will perform following tasks:

- 6. Run the command
+ 7. Run the command
```commandline
- python demo/launch_silver_fanout_demo.py --source=cloudfiles --uc_catalog_name=<<uc catalog name>> --profile=<<DEFAULT>>
+ python demo/launch_silver_fanout_demo.py --source=cloudfiles --uc_catalog_name=<<uc catalog name>> --profile=<<DEFAULT>>
```

- you can provide `--profile=databricks_profile name` in case you already have databricks cli otherwise command prompt will ask host and token.

- - - 6a. Databricks Workspace URL:
- - - Enter your workspace URL, with the format https://<instance-name>.cloud.databricks.com. To get your workspace URL, see Workspace instance names, URLs, and IDs.
+ a. Databricks Workspace URL:
+ Enter your workspace URL, with the format https://<instance-name>.cloud.databricks.com. To get your workspace URL, see Workspace instance names, URLs, and IDs.

- - - 6b. Token:
+ b. Token:
- In your Databricks workspace, click your Databricks username in the top bar, and then select User Settings from the drop down.

- On the Access tokens tab, click Generate new token.
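
If you prefer to set up a named profile ahead of time for the `--profile` option mentioned above, one way (assuming a current Databricks CLI) is the `configure` command; the profile name `dlt_meta_demo` is only an example, and the command prompts for the workspace URL and token, saving them to `~/.databrickscfg`:
```commandline
databricks configure --profile dlt_meta_demo
```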
@@ -241,14 +272,20 @@ This demo will perform following tasks:

2. Install [Databricks CLI](https://docs.databricks.com/dev-tools/cli/index.html)

- 3. ```commandline
+ 3. Install Python package requirements:
+ ```commandline
+ pip install "PyYAML>=6.0" setuptools databricks-sdk
+ pip install delta-spark==3.0.0 pyspark==3.5.5
+ ```
+
+ 4. ```commandline
git clone https://github.com/databrickslabs/dlt-meta.git
```

- 4. ```commandline
+ 5. ```commandline
cd dlt-meta
```
- 5. Set python environment variable into terminal
+ 6. Set the Python environment variable in your terminal:
```commandline
dlt_meta_home=$(pwd)
```
@@ -276,14 +313,20 @@ This demo will perform following tasks:

2. Install [Databricks CLI](https://docs.databricks.com/dev-tools/cli/index.html)

- 3. ```commandline
+ 3. Install Python package requirements:
+ ```commandline
+ pip install "PyYAML>=6.0" setuptools databricks-sdk
+ pip install delta-spark==3.0.0 pyspark==3.5.5
+ ```
+
+ 4. ```commandline
git clone https://github.com/databrickslabs/dlt-meta.git
```

- 4. ```commandline
+ 5. ```commandline
cd dlt-meta
```
- 5. Set python environment variable into terminal
+ 6. Set the Python environment variable in your terminal:
```commandline
dlt_meta_home=$(pwd)
```
@@ -316,32 +359,38 @@ This demo will perform following tasks:

## Overview
This demo showcases how to use Databricks Asset Bundles (DABs) with DLT-Meta:
- * This demo will perform following steps
- * * Create dlt-meta schema's for dataflowspec and bronze/silver layer
- * * Upload nccessary resources to unity catalog volume
- * * Create DAB files with catalog, schema, file locations populated
- * * Deploy DAB to databricks workspace
- * * Run onboarding usind DAB commands
- * * Run Bronze/Silver Pipelines using DAB commands
- * * Demo examples will showcase fan-out pattern in silver layer
- * * Demo example will show case custom transfomations for bronze/silver layers
- * * Adding custom columns and metadata to Bronze tables
- * * Implementing SCD Type 1 to Silver tables
- * * Applying expectations to filter data in Silver tables
+ This demo will perform the following steps:
+ - Create dlt-meta schemas for dataflowspec and the bronze/silver layers
+ - Upload necessary resources to a Unity Catalog volume
+ - Create DAB files with catalog, schema, and file locations populated
+ - Deploy the DAB to the Databricks workspace
+ - Run onboarding using DAB commands
+ - Run Bronze/Silver Pipelines using DAB commands (see the command sketch after this list)
+ - Demo examples will showcase the fan-out pattern in the silver layer
+ - Demo examples will showcase custom transformations for bronze/silver layers
+ - Adding custom columns and metadata to Bronze tables
+ - Implementing SCD Type 1 for Silver tables
+ - Applying expectations to filter data in Silver tables
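
As an illustrative sketch only: deploying and running a bundle from the generated DAB directory typically uses the standard bundle commands shown below. The resource key `onboarding_job` is a hypothetical name rather than what this demo necessarily generates, so check the generated bundle configuration (`databricks.yml`) for the actual job and pipeline keys:
```commandline
databricks bundle validate
databricks bundle deploy
databricks bundle run onboarding_job
```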

### Steps:
1. Launch Command Prompt

2. Install [Databricks CLI](https://docs.databricks.com/dev-tools/cli/index.html)

- 3. ```commandline
+ 3. Install Python package requirements:
+ ```commandline
+ pip install "PyYAML>=6.0" setuptools databricks-sdk
+ pip install delta-spark==3.0.0 pyspark==3.5.5
+ ```
+
+ 4. ```commandline
git clone https://github.com/databrickslabs/dlt-meta.git
```

- 4. ```commandline
+ 5. ```commandline
cd dlt-meta
```
- 5. Set python environment variable into terminal
+ 6. Set the Python environment variable in your terminal:
```commandline
dlt_meta_home=$(pwd)
```