Commit 3a5fe8a

Merge branch 'main' into JOBS-20449
2 parents 12c6c97 + 9b7ca5d commit 3a5fe8a

File tree

166 files changed: +3766 −2467 lines changed

.codegen/_openapi_sha

Lines changed: 1 addition & 1 deletion
@@ -1 +1 @@
-cf9c61453990df0f9453670f2fe68e1b128647a2
+d25296d2f4aa7bd6195c816fdf82e0f960f775da

.gitattributes

Lines changed: 41 additions & 21 deletions
Large diffs are not rendered by default.

.github/PULL_REQUEST_TEMPLATE.md

Lines changed: 26 additions & 4 deletions
@@ -1,6 +1,28 @@
-## Changes
-<!-- Summary of your changes that are easy to understand -->
+## What changes are proposed in this pull request?
 
-## Tests
-<!-- How is this tested? -->
+Provide the readers and reviewers with the information they need to understand
+this PR in a comprehensive manner.
 
+Specifically, try to answer the two following questions:
+
+- **WHAT** changes are being made in the PR? This should be a summary of the
+major changes to allow the reader to quickly understand the PR without having
+to look at the code.
+- **WHY** are these changes needed? This should provide the context that the
+reader might be missing. For example, were there any decisions behind the
+change that are not reflected in the code itself?
+
+The “why part” is the most important of the two as it usually cannot be
+inferred from the code itself. A well-written PR description will help future
+developers (including your future self) to know how to interact and update your
+code.
+
+## How is this tested?
+
+Describe any tests you have done; especially if the tests are not part of
+the unit tests (e.g. local tests).
+
+**ALWAYS ANSWER THIS QUESTION:** Answer with "N/A" if tests are not applicable
+to your PR (e.g. if the PR only modifies comments). Do not be afraid of
+answering "Not tested" if the PR has not been tested. Being clear about what
+has been done and not done provides important context to the reviewers.

CHANGELOG.md

Lines changed: 68 additions & 0 deletions
@@ -1,5 +1,73 @@
 # Version changelog
 
+## [Release] Release v0.35.0
+
+### New Features and Improvements
+
+* DatabricksConfig: Add clone() support ([#376](https://github.com/databricks/databricks-sdk-java/pull/376)).
+
+
+### Bug Fixes
+
+* Fix vulnerabilities in the present SDK version ([#383](https://github.com/databricks/databricks-sdk-java/pull/383)).
+
+
+### Internal Changes
+
+* Add test instructions for external contributors ([#370](https://github.com/databricks/databricks-sdk-java/pull/370)).
+* Always write message for manual test integration ([#374](https://github.com/databricks/databricks-sdk-java/pull/374)).
+* Automatically trigger integration tests on PR ([#369](https://github.com/databricks/databricks-sdk-java/pull/369)).
+* Move templates in the code generator ([#373](https://github.com/databricks/databricks-sdk-java/pull/373)).
+* Refresh PR template ([#381](https://github.com/databricks/databricks-sdk-java/pull/381)).
+
+
+### API Changes:
+
+* Added `workspaceClient.aibiDashboardEmbeddingAccessPolicy()` service and `workspaceClient.aibiDashboardEmbeddingApprovedDomains()` service.
+* Added `workspaceClient.credentials()` service.
+* Added `appDeployment` field for `com.databricks.sdk.service.apps.CreateAppDeploymentRequest`.
+* Added `app` field for `com.databricks.sdk.service.apps.CreateAppRequest`.
+* Added `app` field for `com.databricks.sdk.service.apps.UpdateAppRequest`.
+* Added `table` field for `com.databricks.sdk.service.catalog.CreateOnlineTableRequest`.
+* Added `azureAad` field for `com.databricks.sdk.service.catalog.GenerateTemporaryTableCredentialResponse`.
+* Added `omitUsername` field for `com.databricks.sdk.service.catalog.ListTablesRequest`.
+* Added `fullName` field for `com.databricks.sdk.service.catalog.StorageCredentialInfo`.
+* Added `dashboard` field for `com.databricks.sdk.service.dashboards.CreateDashboardRequest`.
+* Added `schedule` field for `com.databricks.sdk.service.dashboards.CreateScheduleRequest`.
+* Added `subscription` field for `com.databricks.sdk.service.dashboards.CreateSubscriptionRequest`.
+* Added `warehouseId` field for `com.databricks.sdk.service.dashboards.Schedule`.
+* Added `dashboard` field for `com.databricks.sdk.service.dashboards.UpdateDashboardRequest`.
+* Added `schedule` field for `com.databricks.sdk.service.dashboards.UpdateScheduleRequest`.
+* Added `only` field for `com.databricks.sdk.service.jobs.RunNow`.
+* Added `pageToken` field for `com.databricks.sdk.service.oauth2.ListServicePrincipalSecretsRequest`.
+* Added `nextPageToken` field for `com.databricks.sdk.service.oauth2.ListServicePrincipalSecretsResponse`.
+* Added `restartWindow` field for `com.databricks.sdk.service.pipelines.CreatePipeline`.
+* Added `restartWindow` field for `com.databricks.sdk.service.pipelines.EditPipeline`.
+* Added `connectionName` field for `com.databricks.sdk.service.pipelines.IngestionGatewayPipelineDefinition`.
+* Added `restartWindow` field for `com.databricks.sdk.service.pipelines.PipelineSpec`.
+* Added `isNoPublicIpEnabled` field for `com.databricks.sdk.service.provisioning.CreateWorkspaceRequest`.
+* Added `privateAccessSettingsId` field for `com.databricks.sdk.service.provisioning.UpdateWorkspaceRequest`.
+* Added `externalCustomerInfo` and `isNoPublicIpEnabled` fields for `com.databricks.sdk.service.provisioning.Workspace`.
+* Added `lastUsedDay` field for `com.databricks.sdk.service.settings.TokenInfo`.
+* Changed `create()` method for `workspaceClient.apps()` service with new required argument order.
+* Changed `executeMessageQuery()` method for `workspaceClient.genie()` service. New request type is `com.databricks.sdk.service.dashboards.GenieExecuteMessageQueryRequest` class.
+* Changed `create()`, `createSchedule()`, `createSubscription()` and `updateSchedule()` methods for `workspaceClient.lakeview()` service with new required argument order.
+* Removed `workspaceClient.cleanRooms()` service.
+* Removed `deploymentId`, `mode` and `sourceCodePath` fields for `com.databricks.sdk.service.apps.CreateAppDeploymentRequest`.
+* Removed `description`, `name` and `resources` fields for `com.databricks.sdk.service.apps.CreateAppRequest`.
+* Removed `description` and `resources` fields for `com.databricks.sdk.service.apps.UpdateAppRequest`.
+* Removed `name` and `spec` fields for `com.databricks.sdk.service.catalog.CreateOnlineTableRequest`.
+* Removed `displayName`, `parentPath`, `serializedDashboard` and `warehouseId` fields for `com.databricks.sdk.service.dashboards.CreateDashboardRequest`.
+* Removed `cronSchedule`, `displayName` and `pauseStatus` fields for `com.databricks.sdk.service.dashboards.CreateScheduleRequest`.
+* Removed `subscriber` field for `com.databricks.sdk.service.dashboards.CreateSubscriptionRequest`.
+* Removed `displayName`, `etag`, `serializedDashboard` and `warehouseId` fields for `com.databricks.sdk.service.dashboards.UpdateDashboardRequest`.
+* Removed `cronSchedule`, `displayName`, `etag` and `pauseStatus` fields for `com.databricks.sdk.service.dashboards.UpdateScheduleRequest`.
+* Removed `prevPageToken` field for `com.databricks.sdk.service.jobs.Run`.
+
+OpenAPI SHA: d25296d2f4aa7bd6195c816fdf82e0f960f775da, Date: 2024-11-07
+
+
 ## [Release] Release v0.34.0
 
 ### New Features and Improvements

databricks-sdk-java/pom.xml

Lines changed: 5 additions & 5 deletions
@@ -5,7 +5,7 @@
   <parent>
     <groupId>com.databricks</groupId>
     <artifactId>databricks-sdk-parent</artifactId>
-    <version>0.34.0</version>
+    <version>0.35.0</version>
   </parent>
   <artifactId>databricks-sdk-java</artifactId>
   <properties>
@@ -49,9 +49,9 @@
       <scope>provided</scope>
     </dependency>
     <dependency>
-      <groupId>org.ini4j</groupId>
-      <artifactId>ini4j</artifactId>
-      <version>0.5.4</version>
+      <groupId>org.apache.commons</groupId>
+      <artifactId>commons-configuration2</artifactId>
+      <version>2.11.0</version>
       <scope>compile</scope>
     </dependency>
     <dependency>
@@ -67,7 +67,7 @@
     <dependency>
       <groupId>commons-io</groupId>
       <artifactId>commons-io</artifactId>
-      <version>2.13.0</version>
+      <version>2.14.0</version>
     </dependency>
     <dependency>
       <groupId>org.junit.jupiter</groupId>

databricks-sdk-java/src/main/java/com/databricks/sdk/WorkspaceClient.java

Lines changed: 33 additions & 32 deletions
Some generated files are not rendered by default.

databricks-sdk-java/src/main/java/com/databricks/sdk/core/ConfigLoader.java

Lines changed: 21 additions & 20 deletions
@@ -1,16 +1,17 @@
 package com.databricks.sdk.core;
 
 import com.databricks.sdk.core.utils.Environment;
-import java.io.File;
 import java.io.FileNotFoundException;
+import java.io.FileReader;
 import java.io.IOException;
 import java.lang.reflect.Field;
 import java.net.MalformedURLException;
 import java.net.URL;
 import java.nio.file.Paths;
 import java.util.*;
-import org.ini4j.Ini;
-import org.ini4j.Profile;
+import org.apache.commons.configuration2.INIConfiguration;
+import org.apache.commons.configuration2.SubnodeConfiguration;
+import org.apache.commons.configuration2.ex.ConfigurationException;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
@@ -40,7 +41,7 @@ public static DatabricksConfig resolve(DatabricksConfig cfg) throws DatabricksEx
     }
   }
 
-  static void loadFromEnvironmentVariables(DatabricksConfig cfg) throws IllegalAccessException {
+  static void loadFromEnvironmentVariables(DatabricksConfig cfg) {
     if (cfg.getEnv() == null) {
       return;
     }
@@ -57,7 +58,7 @@ static void loadFromEnvironmentVariables(DatabricksConfig cfg) throws IllegalAcc
       }
       accessor.setValueOnConfig(cfg, env);
     }
-    } catch (DatabricksException e) {
+    } catch (DatabricksException | IllegalAccessException e) {
       String msg =
           String.format("%s auth: %s", cfg.getCredentialsProvider().authType(), e.getMessage());
       throw new DatabricksException(msg, e);
@@ -86,46 +87,46 @@ static void loadFromConfig(DatabricksConfig cfg) throws IllegalAccessException {
       configFile = configFile.replaceFirst("^~", userHome);
     }
 
-    Ini ini = parseDatabricksCfg(configFile, isDefaultConfig);
+    INIConfiguration ini = parseDatabricksCfg(configFile, isDefaultConfig);
     if (ini == null) return;
+
     String profile = cfg.getProfile();
     boolean hasExplicitProfile = !isNullOrEmpty(profile);
     if (!hasExplicitProfile) {
       profile = "DEFAULT";
     }
-
-    Profile.Section section = ini.get(profile);
-    if (section == null && !hasExplicitProfile) {
+    SubnodeConfiguration section = ini.getSection(profile);
+    boolean sectionNotPresent = section == null || section.isEmpty();
+    if (sectionNotPresent && !hasExplicitProfile) {
       LOG.info("{} has no {} profile configured", configFile, profile);
       return;
     }
-
-    if (section == null) {
+    if (sectionNotPresent) {
       String msg = String.format("resolve: %s has no %s profile configured", configFile, profile);
       throw new DatabricksException(msg);
     }
 
     for (ConfigAttributeAccessor accessor : accessors) {
-      String value = section.get(accessor.getName());
+      String value = section.getString(accessor.getName());
       if (!isNullOrEmpty(accessor.getValueFromConfig(cfg))) {
         continue;
       }
       accessor.setValueOnConfig(cfg, value);
     }
   }
 
-  private static Ini parseDatabricksCfg(String configFile, boolean isDefaultConfig) {
-    Ini ini = new Ini();
-    try {
-      ini.load(new File(configFile));
+  private static INIConfiguration parseDatabricksCfg(String configFile, boolean isDefaultConfig) {
+    INIConfiguration iniConfig = new INIConfiguration();
+    try (FileReader reader = new FileReader(configFile)) {
+      iniConfig.read(reader);
     } catch (FileNotFoundException e) {
       if (isDefaultConfig) {
         return null;
       }
-    } catch (IOException e) {
+    } catch (IOException | ConfigurationException e) {
       throw new DatabricksException("Cannot load " + configFile, e);
     }
-    return ini;
+    return iniConfig;
   }
 
   public static void fixHostIfNeeded(DatabricksConfig cfg) {
@@ -230,12 +231,12 @@ public static String debugString(DatabricksConfig cfg) {
     if (!attrsUsed.isEmpty()) {
       buf.add(String.format("Config: %s", String.join(", ", attrsUsed)));
     } else {
-      buf.add(String.format("Config: <empty>"));
+      buf.add("Config: <empty>");
     }
     if (!envsUsed.isEmpty()) {
       buf.add(String.format("Env: %s", String.join(", ", envsUsed)));
     } else {
-      buf.add(String.format("Env: <none>"));
+      buf.add("Env: <none>");
     }
     return String.join(". ", buf);
   } catch (IllegalAccessException e) {
