Conversation

@shreyas-goenka (Contributor) commented Dec 5, 2024

This example demonstrates how a job can write a file to a Unity Catalog volume.
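For readers skimming this PR, a minimal sketch of the shape of such a bundle (resource names, paths, and the catalog below are illustrative assumptions, not the exact files added here):

```yaml
# databricks.yml (sketch; names and paths are illustrative)
bundle:
  name: write-to-uc-volume

resources:
  volumes:
    my_volume:
      catalog_name: main
      schema_name: my_schema
      name: my_volume

  jobs:
    write_file_job:
      name: write-file-job
      tasks:
        - task_key: write_file
          notebook_task:
            notebook_path: ./src/write_to_volume.ipynb
            base_parameters:
              # The notebook reads this with dbutils.widgets.get("file_path")
              file_path: /Volumes/main/my_schema/my_volume/hello_world.txt
          # Cluster configuration omitted for brevity.
```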

@@ -0,0 +1,60 @@
{
@shreyas-goenka (Contributor, Author) commented on the diff:

This should render fine in the GitHub repo file view. For example: https://github.com/databricks/cli/blob/main/internal/testdata/notebooks/py1.ipynb

@pietern changed the title from "Add example for a job writing to a UC Volume" to "Add example for a job writing to a Unity Catalog olume" on Dec 6, 2024
@pietern changed the title from "Add example for a job writing to a Unity Catalog olume" to "Add example for a job writing to a Unity Catalog volume" on Dec 6, 2024
@pietern (Contributor) left a comment:

Thanks!

name: my_volume
# We use the ${resources.schemas...} interpolation syntax to force the creation
# of the schema before the volume. Usage of the ${resources.schemas...} syntax
# allows Databricks Asset Bundles to form a dependency graph between resources.
Contributor:

Do we need to go into this here?

@shreyas-goenka (Contributor, Author):

We have had multiple SAs reach out to us and ask how to sequence the resource creation. Given that folks will often use schemas with their volumes, it feels relevant to keep this bit here.

Contributor:

Thanks. Linking the warning PR for posterity: databricks/cli#1989
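For anyone landing on this thread later, the pattern discussed above looks roughly like this (a sketch with illustrative names; the interpolated reference is what creates the ordering):

```yaml
resources:
  schemas:
    my_schema:
      catalog_name: main
      name: my_schema

  volumes:
    my_volume:
      catalog_name: main
      # Using ${resources.schemas.my_schema.name} instead of a literal string
      # adds an edge to the bundle's dependency graph, so the schema is
      # created before the volume that lives in it.
      schema_name: ${resources.schemas.my_schema.name}
      name: my_volume
```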

"metadata": {},
"outputs": [],
"source": [
"file_path = dbutils.widgets.get(\"file_path\")\n",
Contributor:

This doesn't actually work, does it? Without a dbutils.widgets.text() call and/or a widgets section in the ipynb JSON below.

@shreyas-goenka (Contributor, Author):

Works fine:

[screenshot: Screenshot 2024-12-09 at 10 18 18 PM]
(.venv) ➜  cli git:(detect/schema-dep) databricks fs cat dbfs:/Volumes/main/shreyas_goenka_hello_world/my_volume/hello_world.txt -p dogfood
Hello World!%
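Context for why this works: when the notebook runs as a job task, dbutils.widgets.get() reads the value supplied in the task's base_parameters; a dbutils.widgets.text() call only defines a default for interactive runs. A sketch of the notebook cell in plain Python (the file contents match the output above; the rest is an assumption about the example's shape):

```python
# Read the target path passed by the job task's base_parameters; no
# dbutils.widgets.text() definition is needed for a job run.
file_path = dbutils.widgets.get("file_path")

# Write to the UC volume path, e.g.
# /Volumes/<catalog>/<schema>/<volume>/hello_world.txt
with open(file_path, "w") as f:
    f.write("Hello World!")
```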

@shreyas-goenka merged commit 1794a58 into main on Dec 20, 2024
@shreyas-goenka deleted the volume/eg branch on Dec 20, 2024 at 06:19
@balluashok commented:

Hi @shreyas-goenka and @pietern: does DAB have a capability similar to the notebook SQL command CREATE SCHEMA IF NOT EXISTS? Could you please confirm? Our deployment is failing with the following error:

[2025-02-04T11:14:03.488Z] Error: cannot create schema: Schema 'xyz_sv_cd_sch' already exists
[2025-02-04T11:14:03.488Z]
[2025-02-04T11:14:03.488Z] with databricks_schema.cd_schema,
[2025-02-04T11:14:03.488Z] on bundle.tf.json line 143, in resource.databricks_schema.cd_schema:
[2025-02-04T11:14:03.488Z] 143: },
[2025-02-04T11:14:03.488Z]
[2025-02-04T11:14:03.488Z]
[2025-02-04T11:14:03.488Z] Error: cannot create schema: Schema 'xyz_refined' already exists
[2025-02-04T11:14:03.488Z]
[2025-02-04T11:14:03.488Z] with databricks_schema.refined_schema,
[2025-02-04T11:14:03.488Z] on bundle.tf.json line 149, in resource.databricks_schema.refined_schema:
[2025-02-04T11:14:03.488Z] 149: }

@balluashok commented:

Application teams have already created these schemas and volumes using a notebook in the Dev environment, but not yet in Test and Prod, so we are trying to observe the behavior of a DAB deployment in the Dev environment when the schema and volume already exist. But the deployment failed.
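One option worth checking for this situation (an assumption based on recent Databricks CLI releases, not something confirmed in this thread; verify with `databricks bundle deployment bind --help` on your version): binding the bundle-defined schema to the one that already exists in the workspace, so deploy adopts it instead of attempting a fresh create:

```sh
# Hypothetical invocation; the resource key matches the bundle definition
# from the error above, and <catalog_name> is a placeholder for the real
# catalog. After binding, `databricks bundle deploy` manages the existing
# schema rather than trying to create it.
databricks bundle deployment bind cd_schema <catalog_name>.xyz_sv_cd_sch
```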
