Description
select_spark_version does not return the latest DB Runtime when using default parameters
Reproduction
%python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
# Called with default parameters; expected to resolve the latest DB Runtime
print(w.clusters.select_spark_version())
Returns the latest DB Runtime 16 release, e.g. 16.4.x-scala2.12.
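The available runtimes can be confirmed by listing the Spark versions the workspace exposes (a minimal sketch using clusters.spark_versions(); the "17." prefix filter is only for illustration):

%python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
# Print all DB Runtime 17 keys; these are only published with a -scala2.13 suffix
for v in w.clusters.spark_versions().versions:
    if v.key.startswith("17."):
        print(v.key)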
Expected behavior
Expected the latest DB Runtime 17 release, e.g. 17.3.x-scala2.13.
Is it a regression?
No
Additional context
The default scala parameter is set to 2.12, while the latest DB Runtime 17 releases are built on Scala 2.13.
Because the scala default filters the candidate versions, the call falls back to DB Runtime 16, since no DB Runtime 17 version is available for Scala 2.12.
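As a workaround until the default is adjusted, the Scala version can be passed explicitly so DB Runtime 17 releases are not filtered out (a sketch, assuming scala and latest are accepted as keyword arguments, as the default described above suggests):

%python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
# Request the latest runtime built on Scala 2.13 instead of the 2.12 default
print(w.clusters.select_spark_version(latest=True, scala="2.13"))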