
Commit 14cbbe2

♻️ Refactor: generalize Entra pass-through to OAuth pass-through (#115)
1 parent: ade1195

6 files changed (+70, -57 lines)


README.md

Lines changed: 3 additions & 3 deletions

```diff
@@ -69,7 +69,7 @@ To configure the plugin use the values provided under JDBC/ODBC in the advanced
 - [Personal Access Token (PAT)](https://docs.databricks.com/en/dev-tools/auth/pat.html)
 - [Databricks M2M OAuth](https://docs.databricks.com/en/dev-tools/auth/oauth-m2m.html) using a Service Principal Client ID and Client Secret
 - External OAuth Client Credential Endpoint which returns a Databricks token (the OAuth endpoint should implement the default [OAuth Client Credential Grant](https://datatracker.ietf.org/doc/html/rfc6749#section-4.4)) i.e. Azure Entra (OAuth2 Endpoint `https://login.microsoftonline.com/{tenant-id}/oauth2/v2.0/token` & Scope `2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default`)
-- Azure Entra Pass Thru, which uses the Entra Auth token from the signed in user (IMPORTANT: `2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default` has to be added to the scopes of the Entra Auth configuration in Grafana!). Additionally the plugin won't work with this option selected if the user is not signed in via Azure Entra SSO and for backend Grafana Tasks (e.g.Alerting).
+- OAuth2 pass-through, which forwards the Grafana SSO OAuth (i.e. Azure AD/Entra) token from the signed-in user to the plugin. Make sure to set the correct scope in the SSO OAuth configuration of Grafana for your Auth provider, e.g. for Azure AD/Entra SSO Auth the scope `2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default` (AzureDatabricks/user_impersonation) has to be added to the scopes of the Auth configuration in Grafana. Additionally, the plugin won't work with this option selected if the user is not signed in via SSO, or for backend Grafana tasks (e.g. Alerting).
 
 ![img_1.png](img/config_editor.png)
 
@@ -80,7 +80,7 @@ Available configuration fields are as follows:
 | Server Hostname | Databricks Server Hostname (without http). i.e. `XXX.cloud.databricks.com` |
 | Server Port | Databricks Server Port (default `443`) |
 | HTTP Path | HTTP Path value for the existing cluster or SQL warehouse. i.e. `sql/1.0/endpoints/XXX` |
-| Authentication Method | PAT (Personal Access Token), M2M (Machine to Machine) OAuth, OAuth2 Client Credentials Authentication or Azure Entra Pass Thru |
+| Authentication Method | PAT (Personal Access Token), M2M (Machine to Machine) OAuth, OAuth2 Client Credentials Authentication or OAuth pass-through |
 | Client ID | Databricks Service Principal Client ID. (only if OAuth / OAuth2 is chosen as Auth Method) |
 | Client Secret | Databricks Service Principal Client Secret. (only if OAuth / OAuth2 is chosen as Auth Method) |
 | Access Token | Personal Access Token for Databricks. (only if PAT is chosen as Auth Method) |
@@ -113,7 +113,7 @@ datasources:
       hostname: XXX.cloud.databricks.com
       httpPath: sql/1.0/endpoints/XXX
      port: 443
-      authenticationMethod: dsn (=PAT) | m2m | oauth2_client_credentials | azure_entra_pass_thru
+      authenticationMethod: dsn (=PAT) | m2m | oauth2_client_credentials | oauth2_pass_through
      clientId: ...
      externalCredentialsUrl: ...
      oauthScopes: api,read
```
docs/README.md

Lines changed: 28 additions & 28 deletions

````diff
@@ -69,7 +69,7 @@ To configure the plugin use the values provided under JDBC/ODBC in the advanced
 - [Personal Access Token (PAT)](https://docs.databricks.com/en/dev-tools/auth/pat.html)
 - [Databricks M2M OAuth](https://docs.databricks.com/en/dev-tools/auth/oauth-m2m.html) using a Service Principal Client ID and Client Secret
 - External OAuth Client Credential Endpoint which returns a Databricks token (the OAuth endpoint should implement the default [OAuth Client Credential Grant](https://datatracker.ietf.org/doc/html/rfc6749#section-4.4)) i.e. Azure Entra (OAuth2 Endpoint `https://login.microsoftonline.com/{tenant-id}/oauth2/v2.0/token` & Scope `2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default`)
-- Azure Entra Pass Thru, which uses the Entra Auth token from the signed in user (IMPORTANT: `2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default` has to be added to the scopes of the Entra Auth configuration in Grafana!). Additionally the plugin won't work with this option selected if the user is not signed in via Azure Entra SSO and for backend Grafana Tasks (e.g.Alerting).
+- OAuth2 pass-through, which forwards the Grafana SSO OAuth (i.e. Azure AD/Entra) token from the signed-in user to the plugin. Make sure to set the correct scope in the SSO OAuth configuration of Grafana for your Auth provider, e.g. for Azure AD/Entra SSO Auth the scope `2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default` (AzureDatabricks/user_impersonation) has to be added to the scopes of the Auth configuration in Grafana. Additionally, the plugin won't work with this option selected if the user is not signed in via SSO, or for backend Grafana tasks (e.g. Alerting).
 
 ![img_1.png](img/config_editor.png)
 
@@ -80,7 +80,7 @@ Available configuration fields are as follows:
 | Server Hostname | Databricks Server Hostname (without http). i.e. `XXX.cloud.databricks.com` |
 | Server Port | Databricks Server Port (default `443`) |
 | HTTP Path | HTTP Path value for the existing cluster or SQL warehouse. i.e. `sql/1.0/endpoints/XXX` |
-| Authentication Method | PAT (Personal Access Token), M2M (Machine to Machine) OAuth, OAuth2 Client Credentials Authentication or Azure Entra Pass Thru |
+| Authentication Method | PAT (Personal Access Token), M2M (Machine to Machine) OAuth, OAuth2 Client Credentials Authentication or OAuth2 pass-through |
 | Client ID | Databricks Service Principal Client ID. (only if OAuth / OAuth2 is chosen as Auth Method) |
 | Client Secret | Databricks Service Principal Client Secret. (only if OAuth / OAuth2 is chosen as Auth Method) |
 | Access Token | Personal Access Token for Databricks. (only if PAT is chosen as Auth Method) |
@@ -106,32 +106,32 @@ The Datasource configuration can also be done via a YAML file as described [here
 
 ```yaml
 datasources:
-  - name: Databricks
-    type: mullerpeter-databricks-datasource
-    isDefault: true
-    jsonData:
-      hostname: XXX.cloud.databricks.com
-      httpPath: sql/1.0/endpoints/XXX
-      port: 443
-      authenticationMethod: dsn (=PAT) | m2m | oauth2_client_credentials | azure_entra_pass_thru
-      clientId: ...
-      externalCredentialsUrl: ...
-      oauthScopes: api,read
-      timeInterval: 1m
-      maxOpenConns: 0
-      maxIdleConns: 0
-      connMaxLifetime: 3600
-      connMaxIdleTime: 3600
-      retries: 3
-      retryBackoff: 1
-      maxRetryDuration: 60
-      timeout: 60
-      maxRows: 10000
-      defaultQueryFormat: table | time_series
-      defaultEditorMode: builder | code
-    secureJsonData:
-      clientSecret: ...
-      token: ...
+  - name: Databricks
+    type: mullerpeter-databricks-datasource
+    isDefault: true
+    jsonData:
+      hostname: XXX.cloud.databricks.com
+      httpPath: sql/1.0/endpoints/XXX
+      port: 443
+      authenticationMethod: dsn (=PAT) | m2m | oauth2_client_credentials | oauth2_pass_through
+      clientId: ...
+      externalCredentialsUrl: ...
+      oauthScopes: api,read
+      timeInterval: 1m
+      maxOpenConns: 0
+      maxIdleConns: 0
+      connMaxLifetime: 3600
+      connMaxIdleTime: 3600
+      retries: 3
+      retryBackoff: 1
+      maxRetryDuration: 60
+      timeout: 60
+      maxRows: 10000
+      defaultQueryFormat: table | time_series
+      defaultEditorMode: builder | code
+    secureJsonData:
+      clientSecret: ...
+      token: ...
 ```
 ### Supported Macros
 
````
package.json

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,7 +1,7 @@
 {
   "name": "mullerpeter-databricks-datasource",
   "private": true,
-  "version": "1.3.6",
+  "version": "1.3.7",
   "description": "Databricks SQL Connector",
   "scripts": {
     "build": "webpack -c ./.config/webpack/webpack.config.ts --env production",
```
Lines changed: 4 additions & 4 deletions

```diff
@@ -27,15 +27,15 @@ func (ts *TokenStorage) Update(newToken string) {
     ts.token = newToken
 }
 
-type Authenticator struct {
+type OAuthPassThroughAuthenticator struct {
     tokenStorage *TokenStorage
 }
 
-func NewAuthenticator(ts *TokenStorage) *Authenticator {
-    return &Authenticator{tokenStorage: ts}
+func NewOAuthPassThroughAuthenticator(ts *TokenStorage) *OAuthPassThroughAuthenticator {
+    return &OAuthPassThroughAuthenticator{tokenStorage: ts}
 }
 
-func (a *Authenticator) Authenticate(r *http.Request) error {
+func (a *OAuthPassThroughAuthenticator) Authenticate(r *http.Request) error {
     if a.tokenStorage.Get() == "" {
         return fmt.Errorf("Empty Token Pass Trough")
     }
```
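
The hunk above (from the plugin's Go integrations file, whose path is not shown on this page) captures only the renamed type, its constructor, and the empty-token guard at the top of `Authenticate`; the code that actually attaches the token lies outside the diff. A rough, self-contained sketch of how such a pass-through authenticator could be completed, assuming the stored string is the full `Authorization` header value forwarded by Grafana (e.g. `Bearer eyJ...`):

```go
// Sketch only — not the actual file; its remaining lines are outside this diff.
package integrations

import (
	"fmt"
	"net/http"
)

// TokenStorage mirrors the type shown above, reduced to what the sketch needs.
type TokenStorage struct {
	token string
}

func NewTokenStorage(token string) *TokenStorage { return &TokenStorage{token: token} }

func (ts *TokenStorage) Get() string            { return ts.token }
func (ts *TokenStorage) Update(newToken string) { ts.token = newToken }

// OAuthPassThroughAuthenticator injects the forwarded user token into every
// HTTP request the Databricks SQL driver issues.
type OAuthPassThroughAuthenticator struct {
	tokenStorage *TokenStorage
}

func NewOAuthPassThroughAuthenticator(ts *TokenStorage) *OAuthPassThroughAuthenticator {
	return &OAuthPassThroughAuthenticator{tokenStorage: ts}
}

// Authenticate satisfies the auth.Authenticator interface referenced in plugin.go.
func (a *OAuthPassThroughAuthenticator) Authenticate(r *http.Request) error {
	token := a.tokenStorage.Get()
	if token == "" {
		return fmt.Errorf("Empty Token Pass Trough")
	}
	// Assumption: the stored value is the complete header value forwarded by
	// Grafana ("Bearer <token>"), so it can be attached verbatim.
	r.Header.Set("Authorization", token)
	return nil
}
```
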

pkg/plugin/plugin.go

Lines changed: 15 additions & 10 deletions

```diff
@@ -90,7 +90,7 @@ func NewSampleDatasource(ctx context.Context, settings backend.DataSourceInstanc
         port = portInt
     }
 
-    if datasourceSettings.AuthenticationMethod == "m2m" || datasourceSettings.AuthenticationMethod == "oauth2_client_credentials" || datasourceSettings.AuthenticationMethod == "azure_entra_pass_thru" {
+    if datasourceSettings.AuthenticationMethod == "m2m" || datasourceSettings.AuthenticationMethod == "oauth2_client_credentials" || datasourceSettings.AuthenticationMethod == "azure_entra_pass_thru" || datasourceSettings.AuthenticationMethod == "oauth2_pass_through" {
         var authenticator auth.Authenticator
         var tokenStorage *integrations.TokenStorage
 
@@ -112,9 +112,9 @@
             datasourceSettings.Hostname,
             []string{},
         )
-    } else if datasourceSettings.AuthenticationMethod == "azure_entra_pass_thru" {
+    } else if datasourceSettings.AuthenticationMethod == "azure_entra_pass_thru" || datasourceSettings.AuthenticationMethod == "oauth2_pass_through" {
         tokenStorage = integrations.NewTokenStorage("")
-        authenticator = integrations.NewAuthenticator(tokenStorage)
+        authenticator = integrations.NewOAuthPassThroughAuthenticator(tokenStorage)
     } else {
         log.DefaultLogger.Info("Authentication Method Parse Error", "err", nil)
         return nil, fmt.Errorf("authentication Method Parse Error")
@@ -304,6 +304,11 @@ type Datasource struct {
 }
 
 func (d *Datasource) CallResource(ctx context.Context, req *backend.CallResourceRequest, sender backend.CallResourceResponseSender) error {
+    err := d.CheckOAuthPassTrough(req.GetHTTPHeader(backend.OAuthIdentityTokenHeaderName))
+    if err != nil {
+        log.DefaultLogger.Error("OAuth2 Pass Through Authentication failed", "err", err)
+        return err
+    }
     return autocompletionQueries(ctx, req, sender, d)
 }
 
@@ -321,9 +326,9 @@
 func (d *Datasource) QueryData(ctx context.Context, req *backend.QueryDataRequest) (*backend.QueryDataResponse, error) {
     log.DefaultLogger.Info("QueryData called", "request", req)
 
-    err := d.CheckAzureEntraPassThru(req.GetHTTPHeader(backend.OAuthIdentityTokenHeaderName))
+    err := d.CheckOAuthPassTrough(req.GetHTTPHeader(backend.OAuthIdentityTokenHeaderName))
     if err != nil {
-        log.DefaultLogger.Error("Azure Entra Connection Failed", "err", err)
+        log.DefaultLogger.Error("OAuth2 Pass Through Authentication failed", "err", err)
         return nil, err
     }
 
@@ -428,15 +433,15 @@ func (d *Datasource) query(ctx context.Context, pCtx backend.PluginContext, quer
     return response
 }
 
-func (d *Datasource) CheckAzureEntraPassThru(token string) error {
+func (d *Datasource) CheckOAuthPassTrough(token string) error {
 
-    if d.authMethod != "azure_entra_pass_thru" {
+    if d.authMethod != "azure_entra_pass_thru" && d.authMethod != "oauth2_pass_through" {
         return nil
     }
 
     if token == "" {
         log.DefaultLogger.Info("Token is empty")
-        return fmt.Errorf("no Azure Entra Token provided")
+        return fmt.Errorf("No OAuth Token passed through")
     }
     if token != d.tokenStorage.Get() {
         log.DefaultLogger.Info("Token updated")
@@ -453,11 +458,11 @@
 func (d *Datasource) CheckHealth(ctx context.Context, req *backend.CheckHealthRequest) (*backend.CheckHealthResult, error) {
     log.DefaultLogger.Info("CheckHealth called", "request", req)
 
-    err := d.CheckAzureEntraPassThru(req.GetHTTPHeader(backend.OAuthIdentityTokenHeaderName))
+    err := d.CheckOAuthPassTrough(req.GetHTTPHeader(backend.OAuthIdentityTokenHeaderName))
     if err != nil {
         return &backend.CheckHealthResult{
             Status: backend.HealthStatusError,
-            Message: fmt.Sprintf("Azure Entra Connection Failed: %s", err),
+            Message: fmt.Sprintf("OAuth2 Pass Through Authentication failed: %s", err),
         }, nil
     }
 
```
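
Taken together, these hunks add the same guard to `CallResource`, `QueryData`, and `CheckHealth`: read the forwarded `Authorization` header, reject the request if it is empty, and refresh the stored token when it has changed. A small stand-alone illustration of that check-and-refresh logic (the names are illustrative, and the `Update` call in the mismatch branch is an assumption, since that line falls outside the shown hunks):

```go
// Stand-alone illustration of the pass-through check added above; not the
// plugin's own code.
package main

import (
	"errors"
	"fmt"
)

type tokenStore struct{ token string }

func (s *tokenStore) Get() string     { return s.token }
func (s *tokenStore) Update(t string) { s.token = t }

// checkPassThrough mirrors CheckOAuthPassTrough: skip for non-pass-through
// auth methods, reject empty tokens, refresh the store when the token changed.
func checkPassThrough(authMethod, forwarded string, store *tokenStore) error {
	if authMethod != "azure_entra_pass_thru" && authMethod != "oauth2_pass_through" {
		return nil
	}
	if forwarded == "" {
		return errors.New("no OAuth token passed through")
	}
	if forwarded != store.Get() {
		store.Update(forwarded) // assumption: keeps the authenticator's token current
	}
	return nil
}

func main() {
	store := &tokenStore{}
	fmt.Println(checkPassThrough("oauth2_pass_through", "Bearer abc", store)) // <nil>
	fmt.Println(store.Get())                                                  // Bearer abc
	fmt.Println(checkPassThrough("oauth2_pass_through", "", store))           // error: no token
}
```
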

src/components/ConfigEditor/ConfigEditor.tsx

Lines changed: 19 additions & 11 deletions

```diff
@@ -35,10 +35,17 @@ export class ConfigEditor extends PureComponent<Props, State> {
       ...jsonData,
       [key]: value
     }
-    if (key == 'authenticationMethod' && value == 'azure_entra_pass_thru') {
-      jsonData = {
-        ...jsonData,
-        oauthPassThru: true,
+    if (key == 'authenticationMethod') {
+      if (value == 'oauth2_pass_through') {
+        jsonData = {
+          ...jsonData,
+          oauthPassThru: true,
+        }
+      } else {
+        jsonData = {
+          ...jsonData,
+          oauthPassThru: false,
+        }
       }
     }
     onOptionsChange({
@@ -127,8 +134,8 @@
             label: 'OAuth2 Client Credentials',
           },
           {
-            value: 'azure_entra_pass_thru',
-            label: 'Pass Thru Azure Entra Auth',
+            value: 'oauth2_pass_through',
+            label: 'OAuth2 pass-through',
           },
         ]}
         value={jsonData.authenticationMethod || 'dsn'}
@@ -178,7 +185,7 @@
           />
         </InlineField>
       </>
-    ) : jsonData.authenticationMethod != 'azure_entra_pass_thru' && (
+    ) : jsonData.authenticationMethod != 'oauth2_pass_through' && (
       <InlineField label="Access Token" labelWidth={30} tooltip="Databricks Personal Access Token">
         <SecretInput
           isConfigured={(secureJsonFields && secureJsonFields.token) as boolean}
@@ -190,10 +197,11 @@
           />
         </InlineField>
       )}
-      {jsonData.authenticationMethod === 'azure_entra_pass_thru' && (
-        <Alert title="Pass Thru Azure Entra Auth" severity="info">
-          <p>Pass Thru Azure Entra Auth only works if Azure Entra Auth is setup in Grafana and the user is signed in via Azure Entra SSO. (i.e. Alerts and other backend tasks won't work)</p>
-          <p>Make sure to set the correct permissions for the Databricks workspace and the SQL warehouse. And add the following Databricks Scope in the Grafana Azure Entra Auth Configuration Settings: "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default"</p>
+      {jsonData.authenticationMethod === 'oauth2_pass_through' && (
+        <Alert title="OAuth2 pass-through" severity="info">
+          <p>OAuth2 pass-through only works if SSO Auth (i.e. Azure AD/Entra) is set up in Grafana and the user is signed in via SSO (i.e. Alerts and other backend tasks won't work).</p>
+          <p>Make sure to set the correct permissions for the Databricks workspace and the SQL warehouse and add the correct scope in the Grafana Authentication Settings for your SSO Auth provider.</p>
+          <p>E.g. for Azure AD/Entra SSO Auth the following scope has to be added: "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default" (AzureDatabricks/user_impersonation)</p>
         </Alert>
       )}
       <hr/>
```
