
Commit a8c4ead

Empty schema to null (#942)
## Description

Origin: `com.databricks.sql.hive.thriftserver.DatabricksJdbcDialectTestSuite` in runtime/Spark fails because the JDBC URL looks like `jdbc:databricks://sample-host.18.azuredatabricks.net:9999/;`. Note the empty string between the last slash and the semicolon. When parsing the JDBC URL, we look for a schema entry between the last slash (if present) and the semicolon. If that entry is empty, we should set the schema to null.

## Testing

- Unit test: `com.databricks.sql.hive.thriftserver.DatabricksJdbcDialectTestSuite`
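The parsing rule described above can be sketched in isolation. This is a minimal illustration, not the driver's actual implementation: the regex, class, and method names here are simplified stand-ins (the real `JDBC_URL_PATTERN` in `DatabricksConnectionContext` is more involved).

```java
import java.util.Objects;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SchemaParseSketch {
    // Hypothetical simplified pattern: host (group 1), schema between the last
    // slash and the first semicolon (group 2), remaining properties (group 3).
    private static final Pattern JDBC_URL_PATTERN =
        Pattern.compile("jdbc:databricks://([^/;]+)/?([^;]*)(.*)");

    static String parseSchema(String url) {
        Matcher m = JDBC_URL_PATTERN.matcher(url);
        if (m.find()) {
            String schema = m.group(2);
            // The fix: a URL like "...:9999/;" captures "" here, which should
            // be normalized to null rather than kept as an empty schema.
            return Objects.equals(schema, "") ? null : schema;
        }
        return null;
    }

    public static void main(String[] args) {
        // Empty segment between slash and semicolon -> null
        System.out.println(parseSchema(
            "jdbc:databricks://sample-host.18.azuredatabricks.net:9999/;ssl=1"));
        // Non-empty segment -> returned as the schema
        System.out.println(parseSchema(
            "jdbc:databricks://sample-host.18.azuredatabricks.net:9999/myschema;ssl=1"));
    }
}
```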
1 parent 80ca302 commit a8c4ead

File tree

3 files changed: +14 −1 lines changed


NEXT_CHANGELOG.md

Lines changed: 1 addition & 0 deletions

```diff
@@ -15,5 +15,6 @@ AbstractArrowResultChunk
 - Fixed Statement.setMaxRows(0) to be interpreted as no limit.
 - Fixed retry behaviour to not throw an exception when there is no retry-after header for 503 and 429 status codes.
 - Fixed encoded UserAgent parsing in BI tools.
+- Fixed setting empty schema as the default schema in the spark session.
 ---
 *Note: When making changes, please add your change under the appropriate section with a brief description.*
```

src/main/java/com/databricks/jdbc/api/impl/DatabricksConnectionContext.java

Lines changed: 2 additions & 1 deletion

```diff
@@ -134,7 +134,8 @@ public static IDatabricksConnectionContext parse(String url, Properties properti
     Matcher urlMatcher = JDBC_URL_PATTERN.matcher(url);
     if (urlMatcher.find()) {
       String hostUrlVal = urlMatcher.group(1);
-      String schema = urlMatcher.group(2);
+      String schema =
+          Objects.equals(urlMatcher.group(2), EMPTY_STRING) ? null : urlMatcher.group(2);
       String urlMinusHost = urlMatcher.group(3);
       String[] hostAndPort = hostUrlVal.split(DatabricksJdbcConstants.PORT_DELIMITER);
       String hostValue = hostAndPort[0];
```
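The null-safe comparison used in the change above can be demonstrated on its own. This is a sketch, not the driver's code: `normalize` is a hypothetical helper, and `EMPTY_STRING` is assumed to be the empty-string constant referenced in the diff.

```java
import java.util.Objects;

public class EmptyToNullDemo {
    static final String EMPTY_STRING = ""; // assumed constant, matching the diff

    // Objects.equals is null-tolerant: if the regex group is null (e.g. no
    // slash segment at all), the comparison returns false without an NPE,
    // and the null schema passes through unchanged.
    static String normalize(String schema) {
        return Objects.equals(schema, EMPTY_STRING) ? null : schema;
    }

    public static void main(String[] args) {
        System.out.println(normalize(""));        // empty -> null
        System.out.println(normalize("default")); // non-empty -> kept
        System.out.println(normalize(null));      // null -> stays null, no NPE
    }
}
```

Calling `schema.equals(EMPTY_STRING)` directly would throw a `NullPointerException` when the group is absent, which is presumably why the change uses `Objects.equals`.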

src/test/java/com/databricks/jdbc/api/impl/DatabricksConnectionContextTest.java

Lines changed: 11 additions & 0 deletions

```diff
@@ -153,6 +153,17 @@ public void testParseValid() throws DatabricksSQLException {
     assertEquals(M2M_AUTH_TYPE, connectionContext.getGcpAuthType());
   }
 
+  @Test
+  public void testEmptySchemaConvertedToNull() throws DatabricksSQLException {
+    String urlWithEmptySchema =
+        "jdbc:databricks://sample-host.18.azuredatabricks.net:9999/;ssl=1;AuthMech=3;"
+            + "httpPath=/sql/1.0/warehouses/999999999;LogLevel=debug;LogPath=./test1;auth_flow=2";
+    DatabricksConnectionContext connectionContext =
+        (DatabricksConnectionContext)
+            DatabricksConnectionContext.parse(urlWithEmptySchema, properties);
+    assertNull(connectionContext.getSchema());
+  }
+
   @Test
   public void testParseValidBasicUrl() throws DatabricksSQLException {
     // test default AuthMech
```
