
Commit ddd28bb: Added documentation for MLflow Registry Webhooks resource (#1086)

1 parent: 49a66b2

5 files changed: +120 -3 lines


CHANGELOG.md

Lines changed: 1 addition & 0 deletions
```diff
@@ -6,6 +6,7 @@
 * Fixed issue arising when destroying `databricks_sql_global_config` with instance profile set ([#1076](https://github.com/databrickslabs/terraform-provider-databricks/issues/1076)).
 * Added setting of SQL configuration parameters in `databricks_sql_global_config` ([#1080](https://github.com/databrickslabs/terraform-provider-databricks/pull/1080)).
 * Added support for release channels in `databricks_sql_endpoint` configuration ([#1078](https://github.com/databrickslabs/terraform-provider-databricks/pull/1078)).
+* Added documentation for `databricks_mlflow_webhook` resource ([#1086](https://github.com/databrickslabs/terraform-provider-databricks/pull/1086)).

 Updated dependency versions:
```

docs/guides/aws-e2-firewall-hub-and-spoke.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -90,7 +90,7 @@ Before [managing workspace](workspace-management.md), you have to create:
 terraform {
   required_providers {
     databricks = {
-      source = "databrickslabs/databricks"
+      source = "databrickslabs/databricks"
     }
     aws = {
       source = "hashicorp/aws"
```

docs/guides/aws-e2-firewall-workspace.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -88,7 +88,7 @@ Before [managing workspace](workspace-management.md), you have to create:
 terraform {
   required_providers {
     databricks = {
-      source = "databrickslabs/databricks"
+      source = "databrickslabs/databricks"
     }
     aws = {
       source = "hashicorp/aws"
```

docs/resources/mlflow_webhook.md

Lines changed: 116 additions & 0 deletions

---
subcategory: "MLflow"
---
# databricks_mlflow_webhook Resource

This resource allows you to create [MLflow Model Registry Webhooks](https://docs.databricks.com/applications/mlflow/model-registry-webhooks.html) in Databricks. Webhooks enable you to listen for Model Registry events so your integrations can automatically trigger actions. You can use webhooks to automate and integrate your machine learning pipeline with existing CI/CD tools and workflows. Webhooks can trigger the execution of a Databricks job or call a web service whenever specific events occur in the MLflow Registry, such as a stage transition, the creation of a registered model, or the creation of a transition request.

## Example Usage

### Triggering Databricks job

```hcl
variable "dbhost" {
  description = "URL of Databricks workspace"
}
variable "dbtoken" {
  description = "Token to access Databricks workspace"
}

data "databricks_current_user" "me" {}
data "databricks_spark_version" "latest" {}
data "databricks_node_type" "smallest" {
  local_disk = true
}

resource "databricks_notebook" "this" {
  path           = "${data.databricks_current_user.me.home}/MLFlowWebhook"
  language       = "PYTHON"
  content_base64 = base64encode(<<-EOT
    import json

    event_message = dbutils.widgets.get("event_message")
    event_message_dict = json.loads(event_message)
    print(f"event data={event_message_dict}")
    EOT
  )
}

resource "databricks_job" "this" {
  name = "Terraform MLflowWebhook Demo (${data.databricks_current_user.me.alphanumeric})"

  new_cluster {
    num_workers   = 1
    spark_version = data.databricks_spark_version.latest.id
    node_type_id  = data.databricks_node_type.smallest.id
  }

  notebook_task {
    notebook_path = databricks_notebook.this.path
  }
}

resource "databricks_mlflow_webhook" "job" {
  events      = ["TRANSITION_REQUEST_CREATED"]
  description = "Databricks Job webhook trigger"
  status      = "ACTIVE"
  job_spec {
    job_id        = databricks_job.this.id
    workspace_url = var.dbhost
    access_token  = var.dbtoken
  }
}
```

### POSTing to URL

```hcl
resource "databricks_mlflow_webhook" "url" {
  events      = ["TRANSITION_REQUEST_CREATED"]
  description = "URL webhook trigger"
  http_url_spec {
    url = "https://my_cool_host/webhook"
  }
}
```

## Argument Reference

The following arguments are supported:

* `model_name` - (Optional) Name of the MLflow model for which the webhook will be created. If the model name is not specified, a registry-wide webhook is created that listens for the specified events across all versions of all registered models.
* `description` - (Optional) Description of the MLflow webhook.
* `status` - (Optional) Status of the webhook. Possible values are `ACTIVE`, `TEST_MODE`, and `DISABLED`. Default is `ACTIVE`.
* `events` - (Required) The list of events that will trigger execution of the Databricks job or the POST to a URL, for example, `MODEL_VERSION_CREATED`, `MODEL_VERSION_TRANSITIONED_STAGE`, `TRANSITION_REQUEST_CREATED`, etc. Refer to the [Webhooks API documentation](https://docs.databricks.com/dev-tools/api/latest/mlflow.html#operation/create-registry-webhook) for a full list of supported events.

Configuration must include one of the `http_url_spec` or `job_spec` blocks, but not both. A model-scoped configuration is sketched below.
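
For instance, a minimal sketch of a model-scoped webhook, assuming a `databricks_mlflow_model` resource is managed alongside it; the model name and endpoint URL here are hypothetical placeholders:

```hcl
resource "databricks_mlflow_model" "example" {
  name = "My-MLflow-Model" # hypothetical model name
}

# Fires only for events on the model above; omitting model_name
# would instead create a registry-wide webhook.
resource "databricks_mlflow_webhook" "model_scoped" {
  model_name = databricks_mlflow_model.example.name
  events     = ["MODEL_VERSION_CREATED"]

  http_url_spec {
    url = "https://my_cool_host/webhook" # hypothetical endpoint
  }
}
```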

### job_spec

* `access_token` - (Required) The personal access token used to authorize the webhook's job runs.
* `job_id` - (Required) ID of the Databricks job that the webhook runs.
* `workspace_url` - (Optional) URL of the workspace containing the job that this webhook runs. If not specified, the job's workspace URL is assumed to be the same as the workspace where the webhook is created.

### http_url_spec

* `url` - (Required) External HTTPS URL called on event trigger (via a POST request). The structure of the payload depends on the event type; refer to the [documentation](https://docs.databricks.com/applications/mlflow/model-registry-webhooks.html) for more details.
* `authorization` - (Optional) Value of the authorization header that should be sent in the request issued by the webhook. It should be of the form `<auth type> <credentials>`, e.g. `Bearer <access_token>`. If set to an empty string, no authorization header will be included in the request.
* `enable_ssl_verification` - (Optional) Enable/disable SSL certificate validation. Default is `true`. For self-signed certificates, this field must be `false` AND the destination server must disable certificate validation as well. For security purposes, you are encouraged to perform secret validation with the HMAC-encoded portion of the payload, and to acknowledge the risk associated with disabling hostname validation, which makes it more likely that requests can be maliciously routed to an unintended host.
* `secret` - (Optional) Shared secret required for HMAC encoding of the payload. The HMAC-encoded payload will be sent in the header as `X-Databricks-Signature: encoded_payload`. A sketch combining these optional fields follows this list.
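
For illustration, a minimal sketch of an `http_url_spec` that sets all of the optional fields; the endpoint URL and the `webhook_token` and `webhook_secret` variables are hypothetical placeholders, not part of the resource schema:

```hcl
variable "webhook_token" {
  description = "Hypothetical token sent in the authorization header"
  sensitive   = true
}
variable "webhook_secret" {
  description = "Hypothetical shared secret for HMAC payload signing"
  sensitive   = true
}

resource "databricks_mlflow_webhook" "url_full" {
  events      = ["MODEL_VERSION_TRANSITIONED_STAGE"]
  description = "URL webhook with authorization and HMAC secret"
  status      = "TEST_MODE" # keep out of production until the receiver validates signatures

  http_url_spec {
    url                     = "https://my_cool_host/webhook" # hypothetical endpoint
    authorization           = "Bearer ${var.webhook_token}"
    enable_ssl_verification = true
    secret                  = var.webhook_secret # receiver checks X-Databricks-Signature against this
  }
}
```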

## Access Control

* MLflow webhooks can be configured only by workspace admins.
## Related Resources

The following resources are often used in the same context:

* [End to end workspace management](../guides/workspace-management.md) guide.
* [databricks_directory](directory.md) to manage directories in the [Databricks Workspace](https://docs.databricks.com/workspace/workspace-objects.html).
* [databricks_mlflow_experiment](mlflow_experiment.md) to manage [MLflow experiments](https://docs.databricks.com/data/data-sources/mlflow-experiment.html) in Databricks.
* [databricks_mlflow_model](mlflow_model.md) to create [MLflow models](https://docs.databricks.com/applications/mlflow/models.html) in Databricks.
* [databricks_notebook](notebook.md) to manage [Databricks Notebooks](https://docs.databricks.com/notebooks/index.html).
* [databricks_notebook](../data-sources/notebook.md) data source to export a notebook from the Databricks Workspace.
* [databricks_repo](repo.md) to manage [Databricks Repos](https://docs.databricks.com/repos.html).

docs/resources/sql_global_config.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -34,7 +34,7 @@ resource "databricks_sql_global_config" "this" {
     "spark.hadoop.fs.azure.account.oauth2.client.endpoint" : "https://login.microsoftonline.com/${var.tenant_id}/oauth2/token"
   }
   sql_config_params = {
-    "ANSI_MODE": "true"
+    "ANSI_MODE" : "true"
   }
 }
 ```
````
