Commit 06e9a66 (authored by mgyucht and Claude)
Fix selectSparkVersion() to use contains() instead of equals() (#504)
## What changes are proposed in this pull request?

This PR fixes a bug in the `selectSparkVersion()` method in `ClustersExt.java` where Spark version matching doesn't work with real Databricks Runtime version names. The current `equals()` implementation fails because real Databricks Runtime version names contain additional information: the actual version name is `"13.3 LTS (includes Apache Spark 3.4.1, Scala 2.12)"`, not just `"Apache Spark 3.4.1"`. Both the Go SDK (`strings.Contains()`) and the Python SDK (`in` operator) use substring matching for this functionality. Originally reported in PR #229 with real API response data.

## How is this tested?

Added a focused unit test, `sparkVersionWithSparkVersionParameter()`, that demonstrates the fix works with realistic API response data. The test uses a version name in the actual format returned by the Databricks API: `"13.3 LTS (includes Apache Spark 3.4.1, Scala 2.12)"`.

Fixes #229

Co-authored-by: Claude <[email protected]>

Parent commit: 72de0ee
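To illustrate the bug described above, here is a minimal standalone sketch (not the SDK code; the class name is hypothetical) showing why `equals()` fails against a real Databricks Runtime version name while `contains()` succeeds:

```java
// Hypothetical demo class, not part of the SDK: compares the two matching
// strategies against a realistic version name from the Databricks API.
public class VersionMatchDemo {
    public static void main(String[] args) {
        String realName = "13.3 LTS (includes Apache Spark 3.4.1, Scala 2.12)";
        String wanted = "Apache Spark " + "3.4.1";

        // equals() demands an exact match, but the real name carries extra text
        boolean equalsMatch = wanted.equals(realName);     // false
        // contains() only asks for the substring, so it tolerates the extra text
        boolean containsMatch = realName.contains(wanted); // true

        System.out.println(equalsMatch);   // prints "false"
        System.out.println(containsMatch); // prints "true"
    }
}
```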

File tree: 3 files changed, +28 −1 lines

#### NEXT_CHANGELOG.md

2 additions, 0 deletions

```diff
@@ -6,6 +6,8 @@

 ### Bug Fixes

+* Fixed `selectSparkVersion()` method to use contains() instead of equals() for spark version matching.
+
 ### Documentation

 ### Internal Changes
```

#### databricks-sdk-java/src/main/java/com/databricks/sdk/mixin/ClustersExt.java

1 addition, 1 deletion

```diff
@@ -50,7 +50,7 @@ public String selectSparkVersion(SparkVersionSelector selector) throws IllegalAr
       matches = version.getName().contains("LTS") || version.getKey().contains("-esr-");
     }
     if (matches && selector.sparkVersion != null) {
-      matches = ("Apache Spark " + selector.sparkVersion).equals(version.getName());
+      matches = version.getName().contains("Apache Spark " + selector.sparkVersion);
     }
     if (matches) {
       versions.add(version.getKey());
```
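The surrounding selection loop can be sketched in isolation as follows. This is an illustrative standalone version, not the SDK implementation: the `Version` record and class name are hypothetical stand-ins for the SDK's `SparkVersion` type, kept only to show how the `contains()` filter collects matching version keys.

```java
import java.util.ArrayList;
import java.util.List;

public class SelectSparkVersionSketch {
    // Hypothetical stand-in for the SDK's SparkVersion (display name + key)
    public record Version(String name, String key) {}

    // Collect the keys of all versions whose display name contains the
    // requested "Apache Spark X.Y.Z" substring
    public static List<String> selectBySparkVersion(List<Version> all, String sparkVersion) {
        List<String> keys = new ArrayList<>();
        for (Version v : all) {
            // contains() tolerates extra text such as "13.3 LTS (includes ..., Scala 2.12)"
            if (v.name().contains("Apache Spark " + sparkVersion)) {
                keys.add(v.key());
            }
        }
        return keys;
    }

    public static void main(String[] args) {
        List<Version> versions = List.of(
            new Version("13.3 LTS (includes Apache Spark 3.4.1, Scala 2.12)", "13.3.x-scala2.12"),
            new Version("12.2 LTS (includes Apache Spark 3.3.2, Scala 2.12)", "12.2.x-scala2.12"));
        System.out.println(selectBySparkVersion(versions, "3.4.1")); // prints "[13.3.x-scala2.12]"
    }
}
```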

#### databricks-sdk-java/src/test/java/com/databricks/sdk/mixin/ClustersExtTest.java

25 additions, 0 deletions

```diff
@@ -118,4 +118,29 @@ void nullComparisonTest() {
     String nodeType = clustersExt.selectNodeType(new NodeTypeSelector().withLocalDisk());
     assertEquals("testId1", nodeType);
   }
+
+  private GetSparkVersionsResponse testGetSparkVersionsWithSparkVersion() {
+    Collection<SparkVersion> versions = new ArrayList<>();
+    // Mock realistic Databricks Runtime version based on actual API response format
+    // The key point: version name contains more than just "Apache Spark X.Y.Z"
+    versions.add(
+        new SparkVersion()
+            .setName("13.3 LTS (includes Apache Spark 3.4.1, Scala 2.12)")
+            .setKey("13.3.x-scala2.12"));
+    return new GetSparkVersionsResponse().setVersions(versions);
+  }
+
+  @Test
+  void sparkVersionWithSparkVersionParameter() {
+    ClustersExt clustersExt = new ClustersExt(clustersMock);
+    Mockito.doReturn(testGetSparkVersionsWithSparkVersion()).when(clustersMock).sparkVersions();
+
+    // Test that sparkVersion parameter works with realistic API response format
+    // This tests the contains() fix - the version name is "13.3 LTS (includes Apache Spark 3.4.1,
+    // Scala 2.12)"
+    // not just "Apache Spark 3.4.1", so equals() would fail but contains() works
+    String sparkVersion =
+        clustersExt.selectSparkVersion(new SparkVersionSelector().withSparkVersion("3.4.1"));
+    assertEquals("13.3.x-scala2.12", sparkVersion);
+  }
 }
```
