Releases: databricks/databricks-sql-nodejs

1.12.0

18 Sep 11:09
3f2eec1

Full Changelog: 1.11.0...1.12.0

1.11.0

19 May 18:12
57e07c1

Full Changelog: 1.10.0...1.11.0

1.10.0

12 Mar 16:45
6d6dd8b

Full Changelog: 1.9.0...1.10.0

1.9.0

16 Dec 19:38
b755e62

Full Changelog: 1.8.4...1.9.0

1.8.4

19 Apr 17:02
1e4b730

  • Fix: proxy agent unintentionally overwrites protocol in URL (#241)
  • Improve Array.at/TypedArray.at polyfill (#242 by @barelyhuman)
  • UC Volume ingestion: stream files instead of loading them into memory (#247)
  • UC Volume ingestion: improve behavior on SQL REMOVE (#249)
  • Expose session and query ID (#250)
  • Make the lz4 module optional so the package manager can skip it when it cannot be installed (#246)

Full diff: 1.8.3...1.8.4
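
The Array.at/TypedArray.at polyfill fix above concerns runtimes that predate ES2022's Array.prototype.at. A minimal standalone sketch of the semantics (not the library's exact polyfill code):

```javascript
// Sketch of Array.prototype.at semantics (ES2022), written as a
// standalone helper rather than a prototype patch.
function arrayAt(arr, index) {
  // Truncate toward zero, as the spec's ToIntegerOrInfinity does
  let i = Math.trunc(index) || 0;
  // Negative indices count back from the end of the array
  if (i < 0) i += arr.length;
  // Out-of-range access returns undefined instead of throwing
  if (i < 0 || i >= arr.length) return undefined;
  return arr[i];
}
```

The key behaviors a polyfill must preserve are negative indexing (`-1` is the last element) and `undefined` for out-of-range indices.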

1.8.3

27 Mar 18:10
4592c40

  • Improved retry behavior (#230)
  • Fix: in some cases library returned too many results (#239)

Full diff: 1.8.2...1.8.3

1.8.2

12 Mar 19:35
17112c7

Improved results handling when running queries against older DBR versions (#232)

Full diff: 1.8.1...1.8.2

1.8.1

15 Feb 18:08
340a63a

This is a security release that addresses an issue with a library dependency:

https://github.com/databricks/databricks-sql-nodejs/security/dependabot/34

An issue in all published versions of the NPM package ip allows an attacker to execute arbitrary code and
obtain sensitive information via the isPublic() function. This can lead to Server-Side Request
Forgery (SSRF) attacks. The core issue is the function's failure to accurately distinguish between
public and private IP addresses.
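
To illustrate the class of bug (this is an illustrative sketch, not the ip package's implementation): a check that decides whether an address is private must cover every private, loopback, and link-local range, or an attacker-supplied address can be misclassified as safe:

```javascript
// Illustrative check for private/loopback/link-local IPv4 ranges.
// Not the ip package's code; malformed input is rejected rather
// than guessed at, which is where such bugs often hide.
function isPrivateIPv4(addr) {
  const parts = addr.split('.').map(Number);
  if (parts.length !== 4 ||
      parts.some((p) => !Number.isInteger(p) || p < 0 || p > 255)) {
    return false; // not a well-formed dotted-quad IPv4 address
  }
  const [a, b] = parts;
  return (
    a === 10 ||                          // 10.0.0.0/8
    a === 127 ||                         // 127.0.0.0/8 (loopback)
    (a === 172 && b >= 16 && b <= 31) || // 172.16.0.0/12
    (a === 192 && b === 168) ||          // 192.168.0.0/16
    (a === 169 && b === 254)             // 169.254.0.0/16 (link-local)
  );
}
```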

1.8.0

08 Feb 12:37
bf3c9ee

  • Retry failed CloudFetch requests (#211)
  • Fixed compatibility issues with Node@14 (#219)
  • Support Databricks OAuth on Azure (#223) @jackyhu-db
  • Support Databricks OAuth on GCP (#224)
  • Support LZ4 compression for Arrow and CloudFetch results (#216)
  • Fix OAuth M2M flow on Azure (#228)

Full diff: 1.7.1...1.8.0

OAuth on Azure

Some Azure instances now support the Databricks native OAuth flow (in addition to AAD OAuth). For backward
compatibility, the library will continue to use the AAD OAuth flow by default. To use Databricks native OAuth,
pass useDatabricksOAuthInAzure: true to client.connect():

client.connect({
  // other options - host, port, etc.
  authType: 'databricks-oauth',
  useDatabricksOAuthInAzure: true,
  // other OAuth options if needed
});

Also, we fixed an issue with AAD OAuth where the wrong scopes were passed for the M2M flow.

OAuth on GCP

We enabled OAuth support on GCP instances. Since it uses the Databricks native OAuth flow,
all the options are the same as for OAuth on AWS instances.
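
For example, a GCP connection uses the same databricks-oauth options as AWS; the host and path below are placeholders:

```javascript
// Connection options for a GCP workspace; host and path are
// placeholder values, not real endpoints.
const connectOptions = {
  host: 'example.gcp.databricks.com',   // placeholder workspace host
  path: '/sql/1.0/warehouses/abcdef',   // placeholder warehouse path
  authType: 'databricks-oauth',
  // other OAuth options if needed
};
// client.connect(connectOptions);  // as in the Azure example above
```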

CloudFetch improvements

The library will now automatically retry failed CloudFetch requests. The current retry strategy
is quite basic, but it will be improved in the future.
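
As an illustration of the idea (not the library's actual CloudFetch retry code), a basic retry with exponential backoff looks like:

```javascript
// Generic retry helper with exponential backoff; a sketch of the
// technique, not the library's implementation.
async function withRetry(fn, maxAttempts = 3, baseDelayMs = 100) {
  let lastError;
  for (let attempt = 0; attempt < maxAttempts; attempt += 1) {
    try {
      return await fn();
    } catch (error) {
      lastError = error;
      // Wait 100ms, 200ms, 400ms, ... before the next attempt
      await new Promise((resolve) =>
        setTimeout(resolve, baseDelayMs * 2 ** attempt));
    }
  }
  throw lastError; // all attempts failed
}
```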

Also, we implemented support for LZ4-compressed results (both Arrow- and CloudFetch-based). It is enabled by default,
and compression is used whenever the server supports it.
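
Since 1.8.4 the lz4 module is an optional dependency (see above); the usual Node.js pattern for loading an optional module, as a sketch rather than the library's exact code, is:

```javascript
// Common pattern for an optional native dependency: try to load it
// and fall back gracefully when it is not installed.
let lz4 = null;
try {
  lz4 = require('lz4');
} catch (error) {
  // lz4 is not installed; the client can still run, just without
  // LZ4 decompression support.
}

const compressionAvailable = lz4 !== null;
```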

1.7.1

16 Jan 18:28
c85629b

This release contains a fix for the "Premature close" error, which occurred due to a socket limit under intensive use of the library (#217)

Full diff: 1.7.0...1.7.1