
Commit 5daf2ed

alexott and mgyucht authored
[Feature] Added databricks_functions data source (#4154)
## Changes

It's now possible to fetch information about functions defined in a specific UC schema. No integration test yet because we don't have a `databricks_function` resource yet.

Resolves #4111

## Tests

- [x] `make test` run locally
- [x] relevant change in `docs/` folder
- [ ] covered with integration tests in `internal/acceptance`
- [ ] relevant acceptance tests are passing
- [x] using Go SDK

Co-authored-by: Miles Yucht <[email protected]>
1 parent 613ed1a · commit 5daf2ed

File tree

3 files changed: +178 −0 lines

docs/data-sources/functions.md

Lines changed: 86 additions & 0 deletions
---
subcategory: "Unity Catalog"
---
# databricks_functions Data Source

-> This data source can only be used with a workspace-level provider!

Retrieves a list of [User-Defined Functions (UDFs) registered in the Unity Catalog](https://docs.databricks.com/en/udf/unity-catalog.html).
## Example Usage

List all functions defined in a specific schema (`main.default` in this example):

```hcl
data "databricks_functions" "all" {
  catalog_name = "main"
  schema_name  = "default"
}

output "all_functions" {
  value = data.databricks_functions.all.functions
}
```
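
The `functions` attribute can be post-processed with standard Terraform expressions. As a minimal sketch (the output name is illustrative, and it assumes the data source block above), this keeps only the fully qualified names of SQL-language functions:

```hcl
output "sql_function_names" {
  value = [
    for f in data.databricks_functions.all.functions :
    f.full_name if f.routine_body == "SQL"
  ]
}
```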

## Argument Reference

The following arguments are supported:

* `catalog_name` - (Required) Name of [databricks_catalog](../resources/catalog.md).
* `schema_name` - (Required) Name of [databricks_schema](../resources/schema.md).
* `include_browse` - (Optional, Boolean) Flag to specify whether to include UDFs for which the principal can only access selective metadata (see the example below).
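
A variant of the example above that also returns UDFs for which the caller only has selective metadata access might look like this (a sketch, reusing the `main.default` schema):

```hcl
data "databricks_functions" "browsable" {
  catalog_name   = "main"
  schema_name    = "default"
  include_browse = true
}
```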

## Attribute Reference

This data source exports the following attributes:

* `functions` - list of objects describing individual UDFs (see the example after this list). Each object consists of the following attributes (refer to the [REST API documentation](https://docs.databricks.com/api/workspace/functions/list#functions) for an up-to-date list of attributes; the default attribute type is String):
  * `name` - Name of function, relative to parent schema.
  * `catalog_name` - Name of parent catalog.
  * `schema_name` - Name of parent schema, relative to its parent catalog.
  * `input_params` - Object describing input parameters. Consists of a single attribute:
    * `parameters` - The array of definitions of the function's parameters:
      * `name` - Name of parameter.
      * `type_text` - Full data type spec, SQL/catalogString text.
      * `type_json` - Full data type spec, JSON-serialized.
      * `type_name` - Name of type (INT, STRUCT, MAP, etc.).
      * `type_precision` - Digits of precision; required on Create for DecimalTypes.
      * `type_scale` - Digits to the right of the decimal; required on Create for DecimalTypes.
      * `type_interval_type` - Format of IntervalType.
      * `position` - Ordinal position of the column (starting at position 0).
      * `parameter_mode` - The mode of the function parameter.
      * `parameter_type` - The type of function parameter (`PARAM` or `COLUMN`).
      * `parameter_default` - Default value of the parameter.
      * `comment` - User-provided free-form text description.
  * `return_params` - Table function return parameters. See `input_params` for a description.
  * `data_type` - Scalar function return data type.
  * `full_data_type` - Pretty-printed function data type.
  * `routine_body` - Function language (`SQL` or `EXTERNAL`). When `EXTERNAL` is used, the language of the routine function should be specified in the `external_language` field, the `return_params` of the function cannot be used (as the `TABLE` return type is not supported), and the `sql_data_access` field must be `NO_SQL`.
  * `routine_definition` - Function body.
  * `routine_dependencies` - Function dependencies.
  * `parameter_style` - Function parameter style. `S` is the value for SQL.
  * `is_deterministic` - Boolean flag specifying whether the function is deterministic.
  * `sql_data_access` - Function SQL data access (`CONTAINS_SQL`, `READS_SQL_DATA`, `NO_SQL`).
  * `is_null_call` - Boolean flag specifying whether the function is a null call.
  * `security_type` - Function security type (Enum: `DEFINER`).
  * `specific_name` - Specific name of the function; reserved for future use.
  * `external_name` - External function name.
  * `external_language` - External function language.
  * `sql_path` - List of schemas whose objects can be referenced without qualification.
  * `owner` - Username of the current owner of the function.
  * `comment` - User-provided free-form text description.
  * `properties` - JSON-serialized key-value pair map, encoded (escaped) as a string.
  * `metastore_id` - Unique identifier of the parent metastore.
  * `full_name` - Full name of the function, in the form `catalog_name.schema_name.function_name`.
  * `created_at` - Time at which this function was created, in epoch milliseconds.
  * `created_by` - Username of the function's creator.
  * `updated_at` - Time at which this function was last updated, in epoch milliseconds.
  * `updated_by` - Username of the user who last modified the function.
  * `function_id` - ID of the function, relative to the parent schema.
  * `browse_only` - Indicates whether the principal is limited to retrieving metadata for the associated object through the `BROWSE` privilege when `include_browse` is enabled in the request.
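
As an illustration of consuming the nested attributes, the sketch below (output name illustrative, reusing the data source from the example above) maps each function's `full_name` to its `routine_definition`:

```hcl
output "function_bodies" {
  value = {
    for f in data.databricks_functions.all.functions :
    f.full_name => f.routine_definition
  }
}
```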

## Related Resources

The following resources are used in the same context:

* [databricks_schema](./schema.md) to get information about a single schema

internal/providers/pluginfw/pluginfw.go

Lines changed: 2 additions & 0 deletions
```diff
@@ -16,6 +16,7 @@ import (
 	"github.com/databricks/terraform-provider-databricks/commands"
 	"github.com/databricks/terraform-provider-databricks/common"
 	providercommon "github.com/databricks/terraform-provider-databricks/internal/providers/common"
+	"github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/resources/catalog"
 	"github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/resources/cluster"
 	"github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/resources/library"
 	"github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/resources/notificationdestinations"
@@ -60,6 +61,7 @@ func (p *DatabricksProviderPluginFramework) DataSources(ctx context.Context) []f
 		notificationdestinations.DataSourceNotificationDestinations,
 		sharing.DataSourceShare,
 		sharing.DataSourceShares,
+		catalog.DataSourceFunctions,
 	}
 }
```

internal/providers/pluginfw/resources/catalog/data_functions.go

Lines changed: 90 additions & 0 deletions
```go
package catalog

import (
	"context"
	"fmt"

	"github.com/databricks/databricks-sdk-go/apierr"
	"github.com/databricks/databricks-sdk-go/service/catalog"
	"github.com/databricks/terraform-provider-databricks/common"
	pluginfwcommon "github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/common"
	"github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/converters"
	"github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/tfschema"
	"github.com/databricks/terraform-provider-databricks/internal/service/catalog_tf"
	"github.com/hashicorp/terraform-plugin-framework/datasource"
	"github.com/hashicorp/terraform-plugin-framework/datasource/schema"
	"github.com/hashicorp/terraform-plugin-framework/types"
)

// DataSourceFunctions returns a new instance of the databricks_functions data source.
func DataSourceFunctions() datasource.DataSource {
	return &FunctionsDataSource{}
}

// Compile-time check that the data source implements DataSourceWithConfigure.
var _ datasource.DataSourceWithConfigure = &FunctionsDataSource{}

type FunctionsDataSource struct {
	Client *common.DatabricksClient
}

// FunctionsData describes the Terraform schema of the data source: two
// required arguments, an optional flag, and the computed list of functions.
type FunctionsData struct {
	CatalogName   types.String              `tfsdk:"catalog_name"`
	SchemaName    types.String              `tfsdk:"schema_name"`
	IncludeBrowse types.Bool                `tfsdk:"include_browse" tf:"optional"`
	Functions     []catalog_tf.FunctionInfo `tfsdk:"functions" tf:"optional,computed"`
}

func (d *FunctionsDataSource) Metadata(ctx context.Context, req datasource.MetadataRequest, resp *datasource.MetadataResponse) {
	resp.TypeName = "databricks_functions"
}

func (d *FunctionsDataSource) Schema(ctx context.Context, req datasource.SchemaRequest, resp *datasource.SchemaResponse) {
	// Derive the plugin framework schema from the FunctionsData struct tags.
	attrs, blocks := tfschema.DataSourceStructToSchemaMap(FunctionsData{}, nil)
	resp.Schema = schema.Schema{
		Attributes: attrs,
		Blocks:     blocks,
	}
}

func (d *FunctionsDataSource) Configure(_ context.Context, req datasource.ConfigureRequest, resp *datasource.ConfigureResponse) {
	if d.Client == nil {
		d.Client = pluginfwcommon.ConfigureDataSource(req, resp)
	}
}

func (d *FunctionsDataSource) Read(ctx context.Context, req datasource.ReadRequest, resp *datasource.ReadResponse) {
	w, diags := d.Client.GetWorkspaceClient()
	resp.Diagnostics.Append(diags...)
	if resp.Diagnostics.HasError() {
		return
	}

	var functions FunctionsData
	diags = req.Config.Get(ctx, &functions)
	resp.Diagnostics.Append(diags...)
	if resp.Diagnostics.HasError() {
		return
	}
	catalogName := functions.CatalogName.ValueString()
	schemaName := functions.SchemaName.ValueString()
	// List all functions in the given schema via the Go SDK, following pagination.
	functionsInfosSdk, err := w.Functions.ListAll(ctx, catalog.ListFunctionsRequest{
		CatalogName:   catalogName,
		SchemaName:    schemaName,
		IncludeBrowse: functions.IncludeBrowse.ValueBool(),
	})
	if err != nil {
		if apierr.IsMissing(err) {
			resp.State.RemoveResource(ctx)
		}
		resp.Diagnostics.AddError(fmt.Sprintf("failed to get functions for %s.%s schema", catalogName, schemaName), err.Error())
		return
	}
	// Convert each Go SDK struct into its Terraform plugin framework counterpart.
	for _, functionSdk := range functionsInfosSdk {
		var function catalog_tf.FunctionInfo
		resp.Diagnostics.Append(converters.GoSdkToTfSdkStruct(ctx, functionSdk, &function)...)
		if resp.Diagnostics.HasError() {
			return
		}
		functions.Functions = append(functions.Functions, function)
	}
	resp.Diagnostics.Append(resp.State.Set(ctx, functions)...)
}
```
