Databricks sql warehouses #18273
Merged: michelle0927 merged 50 commits into PipedreamHQ:master from Lokeshchand33:databricks-sql-warehouses on Sep 23, 2025 (+749 −34).
Changes from all commits (50 commits)
97b92e1  Added Databricks SQL Warehouses API actions (Lokeshchand33)
5a697bd  Update Databricks SQL Warehouse docs URLs (Lokeshchand33)
6623be2  Merge branch 'master' into databricks-sql-warehouses (Lokeshchand33)
8343661  Merge branch 'master' into databricks-sql-warehouses (Lokeshchand33)
9292e93  fix(databricks): bump component versions and apply lint fixes (Lokeshchand33)
6a9646c  fix(databricks): addressed requested changes (Lokeshchand33)
d66788b  addressed coderabbit review feedback (Lokeshchand33)
e120588  resolved the linting issues (Lokeshchand33)
5238430  Merge branch 'master' into databricks-sql-warehouses (Lokeshchand33)
e742ec2  addressed all test failures (Lokeshchand33)
01ed509  addressed coderabbit review feedback (Lokeshchand33)
d83d206  Merge branch 'master' into databricks-sql-warehouses (Lokeshchand33)
49e997c  resolved the linting issues (Lokeshchand33)
0535802  addressed coderabbit review feedback (Lokeshchand33)
b98476c  addressed coderabbit review feedback (Lokeshchand33)
2153ac3  resolved the linting issues (Lokeshchand33)
b04a050  updates (michelle0927)
2222816  Add default value for maxNumClusters (vunguyenhung)
2aeacf2  create and edit sql warehouses fixes (Lokeshchand33)
9bfe023  create and edit sql warehouse fixes (Lokeshchand33)
99dfc76  updates (michelle0927)
62287c7  Added Vector Search Index API actions (Lokeshchand33)
ee33ab4  addressed coderabbit review feedback (Lokeshchand33)
25fdea5  Merge branch 'master' into databricks-sql-warehouses (Lokeshchand33)
b7e9fd4  Merge branch 'master' into databricks-sql-warehouses (Lokeshchand33)
a13c04c  version updated (Lokeshchand33)
30e5b6c  resolved the linting issues (Lokeshchand33)
dfb8fd9  Merge branch 'PipedreamHQ:master' into databricks-sql-warehouses (Lokeshchand33)
47fe441  addressed all test failures (lokesh154)
e4a1037  Merge branch 'master' into databricks-sql-warehouses (Lokeshchand33)
9bcacc3  addressed coderabbit review feedback (Lokeshchand33)
13a1c59  addressed coderabbit review feedback (Lokeshchand33)
9ea188a  addressed coderabbit review feedback (Lokeshchand33)
ebea510  updated (Lokeshchand33)
75214fc  updated (Lokeshchand33)
8666764  updates (Lokeshchand33)
30b514d  fixed failed test cases (Lokeshchand33)
86b4d47  updated (Lokeshchand33)
da2fa27  updated (Lokeshchand33)
f9aa39e  updated (Lokeshchand33)
008e23b  fixed failed test cases (Lokeshchand33)
8fb3072  fixed failed test cases (Lokeshchand33)
1258a7c  resolved conflict (Lokeshchand33)
8562f9f  Merge branch 'master' into databricks-sql-warehouses (Lokeshchand33)
427452e  updated (Lokeshchand33)
577233e  version updated (Lokeshchand33)
8df764b  updated (Lokeshchand33)
af3ef3b  Merge branch 'master' into databricks-sql-warehouses (Lokeshchand33)
db3b6cd  updated (Lokeshchand33)
6b54d05  Merge branch 'master' into databricks-sql-warehouses (michelle0927)
components/databricks/actions/create-vector-search-index/create-vector-search-index.mjs (163 additions, 0 deletions)
import databricks from "../../databricks.app.mjs";
import utils from "../../common/utils.mjs";
import { ConfigurationError } from "@pipedream/platform";

export default {
  key: "databricks-create-vector-search-index",
  name: "Create Vector Search Index",
  description:
    "Creates a new vector search index in Databricks. [See the documentation](https://docs.databricks.com/api/workspace/vectorsearchindexes/createindex)",
  version: "0.0.1",
  type: "action",
  props: {
    databricks,
    name: {
      type: "string",
      label: "Index Name",
      description:
        "A unique name for the index (e.g., `main_catalog.docs.en_wiki_index`).",
    },
    endpointName: {
      propDefinition: [
        databricks,
        "endpointName",
      ],
    },
    indexType: {
      type: "string",
      label: "Index Type",
      description: "Type of index (`DELTA_SYNC` or `DIRECT_ACCESS`).",
      options: [
        "DELTA_SYNC",
        "DIRECT_ACCESS",
      ],
    },
    primaryKey: {
      type: "string",
      label: "Primary Key",
      description: "The primary key column for the index.",
    },
    sourceTable: {
      type: "string",
      label: "Source Table",
      description:
        "The Delta table backing the index (required for `DELTA_SYNC`).",
      optional: true,
    },
    columnsToSync: {
      type: "string[]",
      label: "Columns to Sync",
      description:
        "List of columns to sync from the source Delta table. Example: `[\"id\", \"text\"]` (required for `DELTA_SYNC`).",
      optional: true,
    },
    embeddingSourceColumns: {
      type: "string[]",
      label: "Embedding Source Columns",
      description:
        "List of embedding source column configs. Each entry is a JSON object string like `{ \"embedding_model_endpoint_name\": \"e5-small-v2\", \"name\": \"text\" }`. Provide when Databricks computes embeddings (DELTA_SYNC).",
      optional: true,
    },
    schemaJson: {
      type: "string",
      label: "Schema JSON",
      description:
        "The schema of the index in JSON format. Example: `{ \"columns\": [{ \"name\": \"id\", \"type\": \"string\" }, { \"name\": \"text_vector\", \"type\": \"array<double>\" }] }`. Required for `DIRECT_ACCESS` indexes.",
      optional: true,
    },
    pipelineType: {
      type: "string",
      label: "Pipeline Type",
      description: "Pipeline type for syncing (default: `TRIGGERED`).",
      options: [
        "TRIGGERED",
        "CONTINUOUS",
      ],
      optional: true,
      default: "TRIGGERED",
    },
  },

  async run({ $ }) {
    const payload = {
      name: this.name,
      endpoint_name: this.endpointName,
      index_type: this.indexType,
      primary_key: this.primaryKey,
    };

    if (this.indexType === "DELTA_SYNC") {
      if (this.schemaJson) {
        throw new ConfigurationError(
          "`Schema JSON` is not allowed when indexType is DELTA_SYNC.",
        );
      }
      if (!this.sourceTable) {
        throw new ConfigurationError(
          "sourceTable is required when indexType is DELTA_SYNC.",
        );
      }

      const columnsToSync = Array.isArray(this.columnsToSync)
        ? this.columnsToSync
        : utils.parseObject(this.columnsToSync);

      const embeddingSourceColumns = utils.parseObject(this.embeddingSourceColumns);
      const hasSource = Array.isArray(embeddingSourceColumns) && embeddingSourceColumns.length > 0;
      if (!hasSource) {
        throw new ConfigurationError(
          "embeddingSourceColumns is required when indexType is DELTA_SYNC.",
        );
      }

      const deltaSpec = {
        source_table: this.sourceTable,
        pipeline_type: this.pipelineType || "TRIGGERED",
      };
      if (Array.isArray(columnsToSync) && columnsToSync.length > 0) {
        deltaSpec.columns_to_sync = columnsToSync;
      }
      if (hasSource) {
        for (const [
          i,
          c,
        ] of embeddingSourceColumns.entries()) {
          if (!c?.name || !c?.embedding_model_endpoint_name) {
            throw new ConfigurationError(
              `embeddingSourceColumns[${i}] must include "name" and "embedding_model_endpoint_name"`,
            );
          }
        }
        deltaSpec.embedding_source_columns = embeddingSourceColumns;
      }
      payload.delta_sync_index_spec = deltaSpec;
    } else if (this.indexType === "DIRECT_ACCESS") {
      if (this.sourceTable || this.columnsToSync?.length || this.embeddingSourceColumns?.length) {
        throw new ConfigurationError(
          "`Source Table`, `Embedding Source Columns`, and `Columns to Sync` are not allowed when indexType is DIRECT_ACCESS.",
        );
      }
      if (!this.schemaJson) {
        throw new ConfigurationError(
          "schemaJson is required when indexType is DIRECT_ACCESS.",
        );
      }
      payload.direct_access_index_spec = {
        schema_json: this.schemaJson,
      };
    }

    const response = await this.databricks.createVectorSearchIndex({
      data: payload,
      $,
    });

    $.export(
      "$summary",
      `Successfully created vector search index: ${response?.name || this.name}`,
    );
    return response;
  },
};
...ts/databricks/actions/delete-vector-search-index-data/delete-vector-search-index-data.mjs (64 additions, 0 deletions)
import databricks from "../../databricks.app.mjs";
import utils from "../../common/utils.mjs";

export default {
  key: "databricks-delete-vector-search-index-data",
  name: "Delete Data from Vector Search Index",
  description:
    "Deletes rows from a Direct Access vector index by primary-key values. [See the documentation](https://docs.databricks.com/api/workspace/vectorsearchindexes/deletedatavectorindex)",
  version: "0.0.1",
  type: "action",
  props: {
    databricks,
    endpointName: {
      propDefinition: [
        databricks,
        "endpointName",
      ],
    },
    indexName: {
      propDefinition: [
        databricks,
        "indexName",
        ({ endpointName }) => ({
          endpointName,
        }),
      ],
    },
    primaryKeys: {
      type: "string[]",
      label: "Primary Keys",
      description:
        "Values of the index’s primary key column to delete (e.g. `1`, `2`). These are the values for the column you set as `primary_key` when the index was created.",
    },
  },
  async run({ $ }) {
    const parsedKeys = utils.parseObject(this.primaryKeys);

    const keys = (Array.isArray(parsedKeys)
      ? parsedKeys
      : [
        parsedKeys,
      ])
      .map((s) => String(s).trim())
      .filter(Boolean);

    if (!keys.length) {
      throw new Error("Please provide at least one primary key to delete.");
    }

    const response = await this.databricks.deleteVectorSearchData({
      indexName: this.indexName,
      params: {
        primary_keys: keys,
      },
      $,
    });

    $.export(
      "$summary",
      `Requested delete of ${keys.length} row(s) from index "${this.indexName}".`,
    );
    return response;
  },
};
components/databricks/actions/delete-vector-search-index/delete-vector-search-index.mjs (36 additions, 0 deletions)
import databricks from "../../databricks.app.mjs";

export default {
  key: "databricks-delete-vector-search-index",
  name: "Delete Vector Search Index",
  description: "Deletes a vector search index in Databricks. [See the documentation](https://docs.databricks.com/api/workspace/vectorsearchindexes/deleteindex)",
  version: "0.0.1",
  type: "action",
  props: {
    databricks,
    endpointName: {
      propDefinition: [
        databricks,
        "endpointName",
      ],
    },
    indexName: {
      propDefinition: [
        databricks,
        "indexName",
        ({ endpointName }) => ({
          endpointName,
        }),
      ],
    },
  },
  async run({ $ }) {
    const response = await this.databricks.deleteVectorSearchIndex({
      indexName: this.indexName,
      $,
    });

    $.export("$summary", `Successfully deleted vector search index: ${this.indexName}`);
    return response;
  },
};
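The app-level methods these actions call (createVectorSearchIndex, deleteVectorSearchData, deleteVectorSearchIndex) live in databricks.app.mjs, which is not part of this excerpt. As a rough sketch only, a method backing the delete call might look like the following; the request helper, URL construction, auth field names, and HTTP verb are assumptions for illustration, not the implementation in this PR.

// Hypothetical sketch of an app-level method; NOT the actual databricks.app.mjs.
// Assumes the axios helper from @pipedream/platform and that the workspace host
// and token come from the connected Databricks account (field names assumed).
import { axios } from "@pipedream/platform";

export default {
  type: "app",
  app: "databricks",
  methods: {
    _baseUrl() {
      // assumed auth field name; the real app may expose the host differently
      return `https://${this.$auth.domain}.cloud.databricks.com/api/2.0`;
    },
    async deleteVectorSearchIndex({
      indexName, $,
    }) {
      // assumed endpoint shape for deleting a vector search index
      return axios($, {
        method: "DELETE",
        url: `${this._baseUrl()}/vector-search/indexes/${indexName}`,
        headers: {
          Authorization: `Bearer ${this.$auth.access_token}`,
        },
      });
    },
  },
};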