Commit 809c0b3
[PECOBLR-1377] SRID in geospatial column type name (databricks#1157)
## Description

This PR enhances geospatial datatype handling to include SRID (Spatial Reference System Identifier) information in column type names, and fixes multiple issues related to complex datatype handling across different result formats.

### Key Changes

1. **Geospatial Type Name Enhancement**
   - Column type names now include the SRID: `GEOMETRY(4326)` instead of `GEOMETRY`
   - Applies to both GEOMETRY and GEOGRAPHY types
   - Preserves full type information in metadata for better type identification

2. **SEA Inline Mode Complex Type Fix**
   - Fixed an issue where complex types (ARRAY, MAP, STRUCT) were not returned as complex objects in SEA Inline mode (JSON array result format)
   - Values are now properly converted to complex datatype objects when `EnableComplexDatatypeSupport=true`

3. **Thrift CloudFetch Metadata Enhancement**
   - Fixed an error when extracting type details (e.g., `INT` from `ARRAY<INT>`) in Thrift CloudFetch mode
   - Enhanced `getColumnInfoFromTColumnDesc()` to use Arrow schema metadata alongside `TColumnDesc`
   - The Arrow schema provides complete type information (e.g., `ARRAY<INT>`), while `TColumnDesc` only contains the base type (e.g., `ARRAY`)

4. **Arrow Metadata Extraction**
   - Added `DatabricksThriftUtil.getArrowMetadata()` to deserialize the Arrow schema from `TGetResultSetMetadataResp`
   - Fixed a null Arrow metadata issue in the `DatabricksResultSet` constructor for Thrift CloudFetch mode

## Testing

### Unit Tests

- All existing unit tests pass, and additional tests were added for the new methods

### Integration Tests

- `GeospatialTests.java` is a comprehensive E2E integration test
- Tests geospatial types (GEOMETRY and GEOGRAPHY)
- Validates **24 configuration combinations**:
  - Protocol: Thrift / SEA
  - Serialization: Arrow / Inline
  - CloudFetch: Enabled / Disabled (only with Arrow, as CloudFetch requires Arrow)
  - GeoSpatial Support: Enabled / Disabled
  - Complex Type Support: Enabled / Disabled
- Validates metadata: column types, type names, class names
- Validates values: WKT representation, SRID
- Validates behavior when geospatial objects are enabled vs. disabled (STRING fallback)
- **All 24 tests pass** ✅

## Additional Notes to the Reviewer

Other required details are mentioned in comments in the diff.

---------

Signed-off-by: Sreekanth Vadigi <sreekanth.vadigi@databricks.com>
1 parent 8a2f333 commit 809c0b3
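For a sense of the user-visible effect, a hedged sketch at the JDBC surface follows; the `st_geomfromtext` query, host placeholders, and exact connection-property spellings are illustrative rather than taken from this change:

import java.sql.*;

public class GeospatialSridExample {
  public static void main(String[] args) throws SQLException {
    // Placeholders and property spellings are illustrative; the PR only names
    // EnableComplexDatatypeSupport and a geospatial-support toggle.
    String url =
        "jdbc:databricks://<host>:443;httpPath=<http-path>;EnableComplexDatatypeSupport=1";
    try (Connection conn = DriverManager.getConnection(url);
        Statement stmt = conn.createStatement();
        ResultSet rs =
            stmt.executeQuery("SELECT st_geomfromtext('POINT(30 10)', 4326) AS geom")) {
      ResultSetMetaData md = rs.getMetaData();
      // Previously "GEOMETRY"; with this change the SRID is part of the type name.
      System.out.println(md.getColumnTypeName(1)); // GEOMETRY(4326)
      while (rs.next()) {
        // With geospatial support enabled this is a geometry object (WKT + SRID);
        // with it disabled, the STRING fallback the integration tests validate.
        System.out.println(rs.getObject(1));
      }
    }
  }
}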

File tree: 10 files changed, +501 / -83 lines

NEXT_CHANGELOG.md

Lines changed: 4 additions & 0 deletions
@@ -6,11 +6,15 @@
 - Added support for disabling CloudFetch via `EnableQueryResultDownload=0` to use inline Arrow results instead.
 
 ### Updated
+- Geospatial column type names now include SRID information (e.g., `GEOMETRY(4326)` instead of `GEOMETRY`).
+- Changed default value of `IgnoreTransactions` from `0` to `1` to disable multi-statement transactions by default. Preview participants can opt-in by setting `IgnoreTransactions=0`. Also updated `supportsTransactions()` to respect this flag.
 - Implemented lazy loading for inline Arrow results, fetching arrow batches on demand instead of all at once. This improves memory usage and initial response time for large result sets when using the Thrift protocol with Arrow format.
 
 ### Fixed
 - Fixed complex data type metadata support when retrieving 0 rows in Arrow format
 - Normalized TIMESTAMP_NTZ to TIMESTAMP in Thrift path for consistency with SEA behavior
+- Fixed complex types not being returned as objects in SEA Inline mode when `EnableComplexDatatypeSupport=true`.
+- Fixed `StringIndexOutOfBoundsException` when parsing complex data types in Thrift CloudFetch mode. The issue occurred when metadata contained incomplete type information (e.g., "ARRAY" instead of "ARRAY<INT>"). Now retrieves complete type information from Arrow metadata.
 
 ---
 *Note: When making changes, please add your change under the appropriate section

src/main/java/com/databricks/jdbc/api/impl/DatabricksResultSet.java

Lines changed: 15 additions & 38 deletions
@@ -1,13 +1,13 @@
 package com.databricks.jdbc.api.impl;
 
 import static com.databricks.jdbc.common.DatabricksJdbcConstants.EMPTY_STRING;
+import static com.databricks.jdbc.common.util.DatabricksThriftUtil.getArrowMetadata;
 import static com.databricks.jdbc.common.util.DatabricksTypeUtil.*;
 
 import com.databricks.jdbc.api.IDatabricksResultSet;
 import com.databricks.jdbc.api.IExecutionStatus;
 import com.databricks.jdbc.api.impl.arrow.ArrowStreamResult;
 import com.databricks.jdbc.api.impl.arrow.ChunkProvider;
-import com.databricks.jdbc.api.impl.arrow.LazyThriftInlineArrowResult;
 import com.databricks.jdbc.api.impl.converters.ConverterHelper;
 import com.databricks.jdbc.api.impl.converters.ObjectConverter;
 import com.databricks.jdbc.api.impl.volume.VolumeOperationResult;
@@ -25,10 +25,7 @@
 import com.databricks.jdbc.log.JdbcLogger;
 import com.databricks.jdbc.log.JdbcLoggerFactory;
 import com.databricks.jdbc.model.client.thrift.generated.TFetchResultsResp;
-import com.databricks.jdbc.model.core.ColumnMetadata;
-import com.databricks.jdbc.model.core.ResultData;
-import com.databricks.jdbc.model.core.ResultManifest;
-import com.databricks.jdbc.model.core.StatementStatus;
+import com.databricks.jdbc.model.core.*;
 import com.databricks.jdbc.model.telemetry.enums.DatabricksDriverErrorCode;
 import com.databricks.jdbc.telemetry.TelemetryHelper;
 import com.databricks.sdk.support.ToStringer;
@@ -153,12 +150,7 @@ public DatabricksResultSet(
     this.executionResult =
         ExecutionResultFactory.getResultSet(resultsResp, session, parentStatement);
     long rowSize = executionResult.getRowCount();
-    List<String> arrowMetadata = null;
-    if (executionResult instanceof ArrowStreamResult) {
-      arrowMetadata = ((ArrowStreamResult) executionResult).getArrowMetadata();
-    } else if (executionResult instanceof LazyThriftInlineArrowResult) {
-      arrowMetadata = ((LazyThriftInlineArrowResult) executionResult).getArrowMetadata();
-    }
+    List<String> arrowMetadata = getArrowMetadata(resultsResp.getResultSetMetadata());
     this.resultSetMetaData =
         new DatabricksResultSetMetaData(
             statementId,
@@ -480,22 +472,6 @@ public ResultSetMetaData getMetaData() throws SQLException {
     return resultSetMetaData;
   }
 
-  /**
-   * Checks if the given type name represents a complex type (ARRAY, MAP, STRUCT, GEOMETRY, or
-   * GEOGRAPHY).
-   *
-   * @param typeName The type name to check
-   * @return true if the type name starts with ARRAY, MAP, STRUCT, GEOMETRY, or GEOGRAPHY, false
-   *     otherwise
-   */
-  private static boolean isComplexType(String typeName) {
-    return typeName.startsWith(ARRAY)
-        || typeName.startsWith(MAP)
-        || typeName.startsWith(STRUCT)
-        || typeName.startsWith(GEOMETRY)
-        || typeName.startsWith(GEOGRAPHY);
-  }
-
   /**
    * Checks if the given type name represents a geospatial type (GEOMETRY or GEOGRAPHY).
    *
@@ -531,27 +507,28 @@ public Object getObject(int columnIndex) throws SQLException {
   }
 
   private Object handleComplexDataTypes(Object obj, String columnName)
-      throws DatabricksParsingException {
-    if (complexDatatypeSupport) return obj;
+      throws DatabricksSQLException {
     if (resultSetType == ResultSetType.SEA_INLINE) {
-      return handleComplexDataTypesForSEAInline(obj, columnName);
+      obj = convertToComplexDataTypesForSEAInline(obj, columnName);
     }
-    return obj.toString();
+    return complexDatatypeSupport ? obj : obj.toString();
   }
 
-  private Object handleComplexDataTypesForSEAInline(Object obj, String columnName)
-      throws DatabricksParsingException {
+  private Object convertToComplexDataTypesForSEAInline(Object obj, String columnName)
+      throws DatabricksSQLException {
     ComplexDataTypeParser parser = new ComplexDataTypeParser();
     if (columnName.startsWith(ARRAY)) {
-      return parser.parseJsonStringToDbArray(obj.toString(), columnName).toString();
+      return parser.parseJsonStringToDbArray(obj.toString(), columnName);
    } else if (columnName.startsWith(MAP)) {
-      return parser.parseJsonStringToDbMap(obj.toString(), columnName).toString();
+      return parser.parseJsonStringToDbMap(obj.toString(), columnName);
    } else if (columnName.startsWith(STRUCT)) {
-      return parser.parseJsonStringToDbStruct(obj.toString(), columnName).toString();
+      return parser.parseJsonStringToDbStruct(obj.toString(), columnName);
    } else if (columnName.startsWith(GEOMETRY)) {
-      return obj;
+      return ConverterHelper.getConverterForColumnType(Types.OTHER, GEOMETRY)
+          .toDatabricksGeometry(obj);
    } else if (columnName.startsWith(GEOGRAPHY)) {
-      return obj;
+      return ConverterHelper.getConverterForColumnType(Types.OTHER, GEOGRAPHY)
+          .toDatabricksGeography(obj);
    }
    throw new DatabricksParsingException(
        "Unexpected metadata format. Type is not a COMPLEX: " + columnName,

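Aside (not part of the diff): a standalone sketch of the reordered flow above, with a stub standing in for `ComplexDataTypeParser` and the geospatial converters. The old code returned the raw JSON text whenever complex support was on; the new flow materializes the object first and stringifies only when support is off.

import java.util.List;

public class ComplexTypeFlowSketch {
  // Stand-in for ComplexDataTypeParser / geospatial conversion.
  static Object parse(Object jsonText) {
    return List.of(1, 2, 3);
  }

  static Object handle(Object obj, boolean seaInline, boolean complexSupport) {
    if (seaInline) {
      obj = parse(obj); // always materialize the complex object first
    }
    // Stringify only at the end, when complex datatype support is disabled.
    return complexSupport ? obj : obj.toString();
  }

  public static void main(String[] args) {
    System.out.println(handle("[1,2,3]", true, true));  // a List, not a String
    System.out.println(handle("[1,2,3]", true, false)); // "[1, 2, 3]"
  }
}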
src/main/java/com/databricks/jdbc/api/impl/DatabricksResultSetMetaData.java

Lines changed: 10 additions & 18 deletions
@@ -107,6 +107,12 @@ public DatabricksResultSetMetaData(
       typeText = "STRING";
     }
 
+    // store base type eg. DECIMAL instead of DECIMAL(7,2) except for geospatial datatypes
+    String finalTypeText =
+        isGeospatialType(columnTypeName)
+            ? typeText
+            : metadataResultSetBuilder.stripTypeName(typeText);
+
     int columnType = DatabricksTypeUtil.getColumnType(columnTypeName);
     int[] precisionAndScale = getPrecisionAndScale(columnInfo, columnType);
     int precision = precisionAndScale[0];
@@ -116,9 +122,7 @@ public DatabricksResultSetMetaData(
             .columnName(columnInfo.getName())
             .columnTypeClassName(DatabricksTypeUtil.getColumnTypeClassName(columnTypeName))
             .columnType(columnType)
-            .columnTypeText(
-                metadataResultSetBuilder.stripTypeName(
-                    typeText)) // store base type eg. DECIMAL instead of DECIMAL(7,2)
+            .columnTypeText(finalTypeText)
             .typePrecision(precision)
             .typeScale(scale)
             .displaySize(DatabricksTypeUtil.getDisplaySize(columnTypeName, precision, scale))
@@ -187,7 +191,9 @@ public DatabricksResultSetMetaData(
         columnIndex++) {
       TColumnDesc columnDesc = resultManifest.getSchema().getColumns().get(columnIndex);
 
-      ColumnInfo columnInfo = getColumnInfoFromTColumnDesc(columnDesc);
+      String columnArrowMetadata =
+          arrowMetadata != null ? arrowMetadata.get(columnIndex) : null;
+      ColumnInfo columnInfo = getColumnInfoFromTColumnDesc(columnDesc, columnArrowMetadata);
       int[] precisionAndScale = getPrecisionAndScale(columnInfo);
       int precision = precisionAndScale[0];
       int scale = precisionAndScale[1];
@@ -228,20 +234,6 @@ public DatabricksResultSetMetaData(
             .columnTypeClassName("java.lang.String")
             .columnType(Types.OTHER)
             .columnTypeText(VARIANT);
-      } else if (isGeometryColumn(arrowMetadata, columnIndex)
-          && ctx.isGeoSpatialSupportEnabled()) {
-        // Only set GEOMETRY type if geospatial support is enabled
-        columnBuilder
-            .columnTypeClassName(GEOMETRY_CLASS_NAME)
-            .columnType(Types.OTHER)
-            .columnTypeText(GEOMETRY);
-      } else if (isGeographyColumn(arrowMetadata, columnIndex)
-          && ctx.isGeoSpatialSupportEnabled()) {
-        // Only set GEOGRAPHY type if geospatial support is enabled
-        columnBuilder
-            .columnTypeClassName(GEOGRAPHY_CLASS_NAME)
-            .columnType(Types.OTHER)
-            .columnTypeText(GEOGRAPHY);
       } else if ((isGeometryColumn(arrowMetadata, columnIndex)
           || isGeographyColumn(arrowMetadata, columnIndex))
           && !ctx.isGeoSpatialSupportEnabled()) {

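Aside (not part of the diff): a minimal sketch of the type-text rule above, assuming `stripTypeName` drops the parenthesized qualifier, as the `DECIMAL(7,2)` to `DECIMAL` comment suggests; geospatial names keep theirs so the SRID survives into the metadata.

public class TypeTextRuleSketch {
  // Stand-in for metadataResultSetBuilder.stripTypeName.
  static String stripTypeName(String typeText) {
    int paren = typeText.indexOf('(');
    return paren < 0 ? typeText : typeText.substring(0, paren);
  }

  static String finalTypeText(String typeText, boolean isGeospatial) {
    return isGeospatial ? typeText : stripTypeName(typeText);
  }

  public static void main(String[] args) {
    System.out.println(finalTypeText("DECIMAL(7,2)", false));  // DECIMAL
    System.out.println(finalTypeText("GEOMETRY(4326)", true)); // GEOMETRY(4326)
  }
}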
src/main/java/com/databricks/jdbc/api/impl/arrow/ArrowStreamResult.java

Lines changed: 10 additions & 3 deletions
@@ -9,6 +9,7 @@
 import com.databricks.jdbc.api.internal.IDatabricksSession;
 import com.databricks.jdbc.api.internal.IDatabricksStatementInternal;
 import com.databricks.jdbc.common.CompressionCodec;
+import com.databricks.jdbc.common.util.DatabricksThriftUtil;
 import com.databricks.jdbc.dbclient.IDatabricksHttpClient;
 import com.databricks.jdbc.dbclient.impl.common.StatementId;
 import com.databricks.jdbc.dbclient.impl.http.DatabricksHttpClientFactory;
@@ -336,13 +337,19 @@ public ChunkProvider getChunkProvider() {
     return chunkProvider;
   }
 
-  private void setColumnInfo(TGetResultSetMetadataResp resultManifest) {
+  private void setColumnInfo(TGetResultSetMetadataResp resultManifest)
+      throws DatabricksSQLException {
     columnInfos = new ArrayList<>();
+    List<String> arrowMetadataList = DatabricksThriftUtil.getArrowMetadata(resultManifest);
     if (resultManifest.getSchema() == null) {
       return;
     }
-    for (TColumnDesc tColumnDesc : resultManifest.getSchema().getColumns()) {
-      columnInfos.add(getColumnInfoFromTColumnDesc(tColumnDesc));
+    List<TColumnDesc> columns = resultManifest.getSchema().getColumns();
+    for (int columnIndex = 0; columnIndex < columns.size(); columnIndex++) {
+      TColumnDesc tColumnDesc = columns.get(columnIndex);
+      String columnArrowMetadata =
+          arrowMetadataList != null ? arrowMetadataList.get(columnIndex) : null;
+      columnInfos.add(getColumnInfoFromTColumnDesc(tColumnDesc, columnArrowMetadata));
     }
   }

src/main/java/com/databricks/jdbc/api/impl/arrow/LazyThriftInlineArrowResult.java

Lines changed: 10 additions & 5 deletions
@@ -1,13 +1,15 @@
 package com.databricks.jdbc.api.impl.arrow;
 
 import static com.databricks.jdbc.common.EnvironmentVariables.DEFAULT_RESULT_ROW_LIMIT;
+import static com.databricks.jdbc.common.util.DatabricksThriftUtil.getColumnInfoFromTColumnDesc;
 import static com.databricks.jdbc.common.util.DatabricksTypeUtil.*;
 import static com.databricks.jdbc.common.util.DecompressionUtil.decompress;
 
 import com.databricks.jdbc.api.impl.IExecutionResult;
 import com.databricks.jdbc.api.internal.IDatabricksSession;
 import com.databricks.jdbc.api.internal.IDatabricksStatementInternal;
 import com.databricks.jdbc.common.CompressionCodec;
+import com.databricks.jdbc.common.util.DatabricksThriftUtil;
 import com.databricks.jdbc.exception.DatabricksParsingException;
 import com.databricks.jdbc.exception.DatabricksSQLException;
 import com.databricks.jdbc.log.JdbcLogger;
@@ -418,15 +420,18 @@ private Field getArrowField(TColumnDesc columnDesc) throws SQLException {
     return new Field(columnDesc.getColumnName(), fieldType, null);
   }
 
-  private void setColumnInfo(TGetResultSetMetadataResp resultManifest) {
+  private void setColumnInfo(TGetResultSetMetadataResp resultManifest)
+      throws DatabricksSQLException {
     columnInfos = new ArrayList<>();
     if (resultManifest.getSchema() == null) {
       return;
     }
-    for (TColumnDesc tColumnDesc : resultManifest.getSchema().getColumns()) {
-      columnInfos.add(
-          com.databricks.jdbc.common.util.DatabricksThriftUtil.getColumnInfoFromTColumnDesc(
-              tColumnDesc));
+    List<String> arrowMetadata = DatabricksThriftUtil.getArrowMetadata(resultManifest);
+    List<TColumnDesc> columns = resultManifest.getSchema().getColumns();
+    for (int columnIndex = 0; columnIndex < columns.size(); columnIndex++) {
+      TColumnDesc tColumnDesc = columns.get(columnIndex);
+      String columnArrowMetadata = arrowMetadata != null ? arrowMetadata.get(columnIndex) : null;
+      columnInfos.add(getColumnInfoFromTColumnDesc(tColumnDesc, columnArrowMetadata));
     }
   }

src/main/java/com/databricks/jdbc/api/impl/converters/ArrowToJavaObjectConverter.java

Lines changed: 5 additions & 16 deletions
@@ -14,6 +14,7 @@
 import java.math.RoundingMode;
 import java.sql.Date;
 import java.sql.Timestamp;
+import java.sql.Types;
 import java.time.*;
 import java.time.format.DateTimeFormatter;
 import java.time.format.DateTimeParseException;
@@ -140,8 +141,11 @@ public static Object convert(
         IntervalConverter ic = new IntervalConverter(arrowMetadata);
         return ic.toLiteral(object);
       case GEOMETRY:
+        return ConverterHelper.getConverterForColumnType(Types.OTHER, GEOMETRY)
+            .toDatabricksGeometry(object);
       case GEOGRAPHY:
-        return convertToGeospatial(object, requiredType);
+        return ConverterHelper.getConverterForColumnType(Types.OTHER, GEOGRAPHY)
+            .toDatabricksGeography(object);
       case NULL:
         return null;
       default:
@@ -169,21 +173,6 @@ private static Object convertToStruct(Object object, String arrowMetadata)
     return parser.parseJsonStringToDbStruct(object.toString(), arrowMetadata);
   }
 
-  private static AbstractDatabricksGeospatial convertToGeospatial(
-      Object object, ColumnInfoTypeName type) throws DatabricksSQLException {
-    String ewkt = convertToString(object);
-
-    // Parse EWKT to extract SRID from data
-    // SRID is always present in EWKT unless it's 0, in which case it is handled in
-    // WKTConverter.extractSRIDFromEWKT()
-    int srid = WKTConverter.extractSRIDFromEWKT(ewkt);
-    String cleanWkt = WKTConverter.removeSRIDFromEWKT(ewkt);
-
-    return type == ColumnInfoTypeName.GEOMETRY
-        ? new DatabricksGeometry(cleanWkt, srid)
-        : new DatabricksGeography(cleanWkt, srid);
-  }
-
   private static Object convertToTimestamp(Object object, Optional<String> timeZoneOpt)
       throws DatabricksSQLException {
     if (object instanceof Text) {
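Aside (not part of the diff): the removed helper leaned on the EWKT convention `SRID=<n>;<WKT>`, with the prefix omitted when the SRID is 0. A self-contained sketch of that convention (the driver's real logic lives in `WKTConverter`):

public class EwktSketch {
  static int extractSrid(String ewkt) {
    // EWKT omits the SRID= prefix when the SRID is 0.
    return ewkt.startsWith("SRID=")
        ? Integer.parseInt(ewkt.substring(5, ewkt.indexOf(';')))
        : 0;
  }

  static String removeSrid(String ewkt) {
    return ewkt.startsWith("SRID=") ? ewkt.substring(ewkt.indexOf(';') + 1) : ewkt;
  }

  public static void main(String[] args) {
    String ewkt = "SRID=4326;POINT(30 10)";
    System.out.println(extractSrid(ewkt)); // 4326
    System.out.println(removeSrid(ewkt));  // POINT(30 10)
  }
}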

src/main/java/com/databricks/jdbc/common/util/DatabricksThriftUtil.java

Lines changed: 49 additions & 2 deletions
@@ -1,5 +1,6 @@
 package com.databricks.jdbc.common.util;
 
+import static com.databricks.jdbc.common.DatabricksJdbcConstants.ARROW_METADATA_KEY;
 import static com.databricks.jdbc.common.EnvironmentVariables.DEFAULT_RESULT_ROW_LIMIT;
 import static com.databricks.jdbc.common.util.DatabricksTypeUtil.*;
 import static com.databricks.jdbc.model.client.thrift.generated.TTypeId.*;
@@ -20,9 +21,14 @@
 import com.databricks.jdbc.model.core.StatementStatus;
 import com.databricks.jdbc.model.telemetry.enums.DatabricksDriverErrorCode;
 import com.databricks.sdk.service.sql.StatementState;
+import java.io.IOException;
 import java.nio.ByteBuffer;
 import java.time.Instant;
 import java.util.*;
+import java.util.stream.Collectors;
+import org.apache.arrow.vector.types.pojo.Field;
+import org.apache.arrow.vector.types.pojo.Schema;
+import org.apache.arrow.vector.util.SchemaUtility;
 
 public class DatabricksThriftUtil {
 
@@ -210,16 +216,30 @@ public static String getTypeTextFromTypeDesc(TTypeDesc typeDesc) {
     return primitiveTypeEntry.getType().name().replace("_TYPE", "");
   }
 
-  public static ColumnInfo getColumnInfoFromTColumnDesc(TColumnDesc columnDesc) {
+  public static ColumnInfo getColumnInfoFromTColumnDesc(
+      TColumnDesc columnDesc, String arrowMetadata) {
     TPrimitiveTypeEntry primitiveTypeEntry = getTPrimitiveTypeOrDefault(columnDesc.getTypeDesc());
     ColumnInfoTypeName columnInfoTypeName =
         T_TYPE_ID_COLUMN_INFO_TYPE_NAME_MAP.get(primitiveTypeEntry.getType());
+
+    String typeText = getTypeTextFromTypeDesc(columnDesc.getTypeDesc());
+
+    if (arrowMetadata != null && isComplexType(arrowMetadata)) {
+      typeText = arrowMetadata;
+      if (arrowMetadata.startsWith(GEOMETRY)) {
+        columnInfoTypeName = ColumnInfoTypeName.GEOMETRY;
+      } else if (arrowMetadata.startsWith(GEOGRAPHY)) {
+        columnInfoTypeName = ColumnInfoTypeName.GEOGRAPHY;
+      }
+    }
+
     ColumnInfo columnInfo =
         new ColumnInfo()
             .setName(columnDesc.getColumnName())
             .setPosition((long) columnDesc.getPosition())
             .setTypeName(columnInfoTypeName)
-            .setTypeText(getTypeTextFromTypeDesc(columnDesc.getTypeDesc()));
+            .setTypeText(typeText);
+
     if (primitiveTypeEntry.isSetTypeQualifiers()) {
       TTypeQualifiers typeQualifiers = primitiveTypeEntry.getTypeQualifiers();
       String scaleQualifierKey = TCLIServiceConstants.SCALE,
@@ -371,4 +391,31 @@ public static void checkDirectResultsForErrorStatus(
       verifySuccessStatus(directResults.getResultSet().getStatus(), context, statementId);
     }
   }
+
+  /**
+   * Deserializes the Arrow schema from TGetResultSetMetadataResp.
+   *
+   * @param metadata the TGetResultSetMetadataResp containing the binary Arrow schema
+   * @return one Arrow metadata value per column, or null if no Arrow schema is present
+   */
+  public static List<String> getArrowMetadata(TGetResultSetMetadataResp metadata)
+      throws DatabricksSQLException {
+    if (metadata == null
+        || metadata.getArrowSchema() == null
+        || metadata.getArrowSchema().length == 0) {
+      return null;
+    }
+    byte[] arrowSchemaBytes = metadata.getArrowSchema();
+    try {
+      Schema arrowSchema = SchemaUtility.deserialize(arrowSchemaBytes, null);
+      return arrowSchema.getFields().stream()
+          .map(Field::getMetadata)
+          .map(e -> e.get(ARROW_METADATA_KEY))
+          .collect(Collectors.toList());
+    } catch (IOException e) {
+      String errorMessage = "Failed to deserialize Arrow schema: " + e.getMessage();
+      LOGGER.error(errorMessage, e);
+      throw new DatabricksSQLException(errorMessage, e, DatabricksDriverErrorCode.RESULT_SET_ERROR);
+    }
+  }
 }
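Aside (not part of the diff): a round-trip sketch showing how field-level metadata in a serialized Arrow schema can carry the full type text that `TColumnDesc` truncates. The metadata key string below is a stand-in; the driver reads the real key from its `ARROW_METADATA_KEY` constant, whose value is not shown in this diff.

import java.util.Collections;
import java.util.List;
import java.util.Map;
import org.apache.arrow.vector.types.pojo.ArrowType;
import org.apache.arrow.vector.types.pojo.Field;
import org.apache.arrow.vector.types.pojo.FieldType;
import org.apache.arrow.vector.types.pojo.Schema;
import org.apache.arrow.vector.util.SchemaUtility;

public class ArrowMetadataRoundTrip {
  public static void main(String[] args) throws Exception {
    String metadataKey = "Spark:DataType:SqlName"; // hypothetical key

    // Field-level metadata carries the complete type text, e.g. "ARRAY<INT>".
    Map<String, String> fieldMetadata = Collections.singletonMap(metadataKey, "ARRAY<INT>");
    Field column =
        new Field("ids", new FieldType(true, ArrowType.Utf8.INSTANCE, null, fieldMetadata), null);
    Schema schema = new Schema(List.of(column));

    // Serialize and deserialize, as the Thrift metadata response round-trips it.
    byte[] bytes = SchemaUtility.serialize(schema);
    Schema roundTripped = SchemaUtility.deserialize(bytes, null);

    // Recover the full type text that TColumnDesc alone would report as "ARRAY".
    System.out.println(roundTripped.getFields().get(0).getMetadata().get(metadataKey));
  }
}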

src/main/java/com/databricks/jdbc/common/util/DatabricksTypeUtil.java

Lines changed: 16 additions & 0 deletions
@@ -555,4 +555,20 @@ public static String getDecimalTypeString(BigDecimal bd) {
     }
     return DECIMAL + "(" + precision + "," + scale + ")";
   }
+
+  /**
+   * Checks if the given type name represents a complex type (ARRAY, MAP, STRUCT, GEOMETRY, or
+   * GEOGRAPHY).
+   *
+   * @param typeName The type name to check
+   * @return true if the type name starts with ARRAY, MAP, STRUCT, GEOMETRY, or GEOGRAPHY, false
+   *     otherwise
+   */
+  public static boolean isComplexType(String typeName) {
+    return typeName.startsWith(ARRAY)
+        || typeName.startsWith(MAP)
+        || typeName.startsWith(STRUCT)
+        || typeName.startsWith(GEOMETRY)
+        || typeName.startsWith(GEOGRAPHY);
+  }
 }
