Commit a2df918

Prepare release v0.3.10 (#885)

1 parent: 0dbd20d

File tree

3 files changed: +16 −1 lines changed

3 files changed

+16
-1
lines changed

CHANGELOG.md

Lines changed: 8 additions & 0 deletions

```diff
@@ -4,6 +4,14 @@
 * Added `private_access_level` and `allowed_vpc_endpoint_ids` to `databricks_mws_private_access_settings` resource, which is also now updatable ([#867](https://github.com/databrickslabs/terraform-provider-databricks/issues/867)).
 * Fixed missing diff skip for `skip_validation` in `databricks_instance_profile` ([#860](https://github.com/databrickslabs/terraform-provider-databricks/issues/860)).
+* Added support for `pipeline_task` ([#871](https://github.com/databrickslabs/terraform-provider-databricks/pull/871)) and `python_wheel_task` ([#872](https://github.com/databrickslabs/terraform-provider-databricks/pull/872)) to `databricks_job`.
+* Improved enterprise HTTPS proxy support for creating workspaces in PrivateLink environments ([#882](https://github.com/databrickslabs/terraform-provider-databricks/pull/882)).
+* Added `hostname` attribute to `odbc_params` in `databricks_sql_endpoint` ([#868](https://github.com/databrickslabs/terraform-provider-databricks/issues/868)).
+* Improved documentation ([#858](https://github.com/databrickslabs/terraform-provider-databricks/pull/858), [#870](https://github.com/databrickslabs/terraform-provider-databricks/pull/870)).
+
+Updated dependency versions:
+
+* Bumped google.golang.org/api from 0.58.0 to 0.59.0
 
 ## 0.3.9
```
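The new `pipeline_task` support listed in the changelog might be used roughly as follows; this is a minimal sketch, and the resource names and the `databricks_pipeline` reference are hypothetical:

```hcl
# Minimal sketch: a job that triggers a Delta Live Tables pipeline.
# Resource names and the pipeline reference are hypothetical.
resource "databricks_job" "this" {
  name = "pipeline-trigger"

  pipeline_task {
    # pipeline_id is the block's required argument
    pipeline_id = databricks_pipeline.this.id
  }
}
```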

compute/acceptance/job_test.go

Lines changed: 1 addition & 1 deletion

```diff
@@ -88,7 +88,7 @@ func TestAwsAccJobsCreate(t *testing.T) {
 	assert.True(t, job.Settings.NewCluster.SparkVersion == newSparkVersion, "Something is wrong with spark version")
 }
 
-func TestAccJobTasks(t *testing.T) {
+func TestPreviewAccJobTasks(t *testing.T) {
 	acceptance.Test(t, []acceptance.Step{
 		{
 			Template: `
```

docs/resources/job.md

Lines changed: 7 additions & 0 deletions

```diff
@@ -137,6 +137,13 @@ You can invoke Spark submit tasks only on new clusters. **In the `new_cluster` s
 
 * `pipeline_id` - (Required) The pipeline's unique ID.
 
+### python_wheel_task Configuration Block
+
+* `entry_point` - (Optional) Python function as entry point for the task
+* `package_name` - (Optional) Name of Python package
+* `parameters` - (Optional) Parameters for the task
+* `named_parameters` - (Optional) Named parameters for the task
+
 ### email_notifications Configuration Block
 
 * `on_failure` - (Optional) (List) list of emails to notify on failure
```
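The `python_wheel_task` arguments documented above could be combined as in the following sketch; the package name, entry point, and parameter values are hypothetical, and cluster configuration is omitted:

```hcl
# Minimal sketch of a wheel-based job; package_name, entry_point,
# and parameters are hypothetical examples.
resource "databricks_job" "wheel" {
  name = "wheel-example"

  python_wheel_task {
    package_name = "my_package"
    entry_point  = "main"
    parameters   = ["--env", "staging"]
  }
}
```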
