File: `articles/synapse-analytics/sql/resources-self-help-sql-on-demand.md`
If Synapse Studio can't establish a connection to serverless SQL pool, one of the following causes might apply:
1) Your network prevents communication with the Azure Synapse backend. The most frequent case is that port 1443 is blocked. To get serverless SQL pool to work, unblock this port. Other problems could also prevent serverless SQL pool from working; [visit the full troubleshooting guide for more information](../troubleshoot/troubleshoot-synapse-studio.md).
2) You don't have permission to log into serverless SQL pool. To gain access, an Azure Synapse workspace administrator should add you to the workspace administrator or SQL administrator role. [Visit the full guide on access control for more information](../security/synapse-workspace-access-control-overview.md).
### Websocket connection was closed unexpectedly
If your query fails with the error message *Websocket connection was closed unexpectedly*, it means that your browser connection to Synapse Studio was interrupted, for example because of a network issue.
To resolve this issue, rerun the query. If this message occurs often in your environment, ask your network administrator for help, check firewall settings, and [visit this troubleshooting guide for more information](../troubleshoot/troubleshoot-synapse-studio.md).
If the issue persists, create a [support ticket](../../azure-portal/supportability/how-to-create-azure-support-request.md) through the Azure portal, and try running the same queries in [Azure Data Studio](/sql/azure-data-studio/download-azure-data-studio) or [SQL Server Management Studio](/sql/ssms/download-sql-server-management-studio-ssms) instead of Synapse Studio for further investigation.
### Serverless databases are not shown in Synapse Studio
If you don't see the databases that are created in serverless SQL pool, check whether your serverless SQL pool is started. If the serverless SQL pool is deactivated, the databases won't be shown. Execute any query (for example, `SELECT 1`) on the serverless SQL pool to activate it, and the databases will be shown.
## Storage access
If you get errors while trying to access files in storage, make sure that you have permission to access the data. You should be able to access publicly available files. If you access data without credentials, make sure that your Azure AD identity can directly access the files.
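
As a quick check, a publicly available file can be queried with `OPENROWSET` and the full file URL; this is a minimal sketch, and the storage account and path below are hypothetical placeholders:

```sql
-- Read a publicly available Parquet file directly by URL.
-- Replace the account and path with a file that you can access.
SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'https://contosostorage.dfs.core.windows.net/public/data/*.parquet',
    FORMAT = 'PARQUET'
) AS rows;
```

If this query fails while an equivalent query over a public sample dataset succeeds, the problem is most likely permissions on your storage account rather than the query itself.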
### Query timeout expired
The error *Query timeout expired* is returned if the query executed for more than 30 minutes on serverless SQL pool. This limit of serverless SQL pool can't be changed. Try to optimize your query by applying [best practices](best-practices-serverless-sql-pool.md#prepare-files-for-querying), or try to materialize parts of your query by using [CETAS](create-external-table-as-select.md). Check whether a concurrent workload is running on the serverless pool, because the other queries might take the resources. In that case, you might split the workload across multiple workspaces.
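
As a hedged sketch of the CETAS approach, an expensive part of a query can be materialized into external storage once, and then queried directly; the data source, file format, and paths below are illustrative, not from this article:

```sql
-- Materialize an aggregation as an external table so later
-- queries read the small result instead of the raw files.
-- my_data_source and parquet_format are assumed to exist.
CREATE EXTERNAL TABLE aggregated_sales
WITH (
    LOCATION = 'aggregated/sales/',
    DATA_SOURCE = my_data_source,
    FILE_FORMAT = parquet_format
)
AS
SELECT region, SUM(amount) AS total_amount
FROM OPENROWSET(
    BULK 'raw/sales/*.parquet',
    DATA_SOURCE = 'my_data_source',
    FORMAT = 'PARQUET'
) AS sales
GROUP BY region;
```

Subsequent queries against `aggregated_sales` avoid rescanning the raw files, which helps keep them under the 30-minute limit.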
### Invalid object name
The error *Invalid object name 'table name'* indicates that you're using an object (table or view) that doesn't exist in the serverless SQL pool database.
- List the tables and views, and check whether the object exists. Use SSMS or Azure Data Studio, because Synapse Studio might show some tables that aren't available in the serverless SQL pool.
- If you see the object, check whether you're using a case-sensitive or binary database collation. The object name might not match the name that you used in the query. With a binary database collation, `Employee` and `employee` are two different objects.
- If you don't see the object, you might be trying to query a table from a Lake or Spark database. There are a few reasons why the table might not be available in the serverless pool:
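
To list the objects in the current database, the standard `sys` catalog views can be queried; this is a minimal sketch using no document-specific names:

```sql
-- List user tables and views visible in the current database,
-- to verify that the object (and its exact casing) exists.
SELECT s.name AS schema_name, o.name AS object_name, o.type_desc
FROM sys.objects AS o
JOIN sys.schemas AS s ON o.schema_id = s.schema_id
WHERE o.type IN ('U', 'V')
ORDER BY s.name, o.name;
```
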
### Could not allocate tempdb space while transferring data from one distribution to another
The error *Could not allocate tempdb space while transferring data from one distribution to another* is returned when the query execution engine can't process data and transfer it between the nodes that are executing the query.
It's a special case of the generic [query fails because it cannot be executed due to current resource constraints](#query-fails-because-it-cannot-be-executed-due-to-current-resource-constraints) error, and is returned when the resources allocated to the `tempdb` database are insufficient to run the query.
Apply the best practices before you file a support ticket.
### Query fails with error while handling an external file (max error count reached)
If your query fails with the error message *Error handling external file: Max errors count reached*, it means that there's a mismatch between a specified column type and the data that needs to be loaded.
To get more information about the error, and to see which rows and columns to look at, change the parser version from `2.0` to `1.0`.
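
For example, a hedged sketch of a CSV query with the parser version switched to `1.0`; the path and data source names are illustrative:

```sql
-- PARSER_VERSION = '1.0' reports more detailed row/column
-- information when a type mismatch occurs during loading.
SELECT *
FROM OPENROWSET(
    BULK 'csv/population/*.csv',
    DATA_SOURCE = 'my_data_source',
    FORMAT = 'CSV',
    PARSER_VERSION = '1.0',
    FIRSTROW = 2
) AS rows;
```

Once you've identified and fixed the offending column definition, you can switch back to parser version `2.0` for better performance.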
### Cannot bulk load because the file could not be opened
The error *Cannot bulk load because the file could not be opened* is returned if a file is modified during the query execution. Usually, you get an error like:
`Cannot bulk load because the file {file path} could not be opened. Operating system error code 12(The access code is invalid.).`
Serverless SQL pools can't read files that are modified while the query is running, because the query can't take a lock on the files.
### The query references an object that is not supported in distributed processing mode
The error *The query references an object that is not supported in distributed processing mode* indicates that your query uses an object or function that can't be used while querying data stored in Azure Data Lake or Cosmos DB analytical storage. Some objects, such as system views, and some functions can't be used in these queries. Avoid queries that join external data with system views, load external data into a temp table, or use security or metadata functions to filter external data.
### `WaitIOCompletion` call failed
The error message *WaitIOCompletion call failed* indicates that the query failed while waiting to complete an I/O operation that reads data from the remote storage (Azure Data Lake).
Make sure that your storage is placed in the same region as the serverless SQL pool, and that you aren't using `archive access` storage, which is paused by default. Check the storage metrics and verify that there are no other workloads on the storage layer, such as uploading new files, that could saturate I/O requests.
### Incorrect syntax near 'NOT'
The error *Incorrect syntax near 'NOT'* indicates that some external tables have columns that contain the `NOT NULL` constraint in the column definition. Update the table to remove `NOT NULL` from the column definition. This error can sometimes also occur transiently with tables created from a CETAS statement. If the problem doesn't resolve, you can try dropping and re-creating the external table.
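
A hedged sketch of dropping and re-creating such a table without the constraint; the table name, columns, data source, and file format are illustrative:

```sql
-- Re-create the external table with plain (nullable) columns.
-- my_data_source and parquet_format are assumed to already exist.
DROP EXTERNAL TABLE my_table;

CREATE EXTERNAL TABLE my_table (
    id INT,              -- previously declared as: id INT NOT NULL
    name VARCHAR(100)
)
WITH (
    LOCATION = 'data/my_table/',
    DATA_SOURCE = my_data_source,
    FILE_FORMAT = parquet_format
);
```
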
### Inserting value to batch for column type DATETIME2 failed
The error *Inserting value to batch for column type DATETIME2 failed* indicates that the serverless pool can't read the date values from the underlying files. The datetime value stored in the Parquet or Delta Lake file can't be represented as a `DATETIME2` column. Inspect the minimum value in the file by using Spark, and check whether there are dates earlier than 0001-01-03. If you stored the files by using Spark 2.4, datetime values before that date are written by using the Julian calendar, which isn't aligned with the proleptic Gregorian calendar used in serverless SQL pools. There might be a two-day difference between the Julian calendar used to write the values in Parquet (in some Spark versions) and the proleptic Gregorian calendar used in serverless SQL pool, which might cause conversion to an invalid (negative) date value.
Try to use Spark to update these values, because they're treated as invalid date values in SQL. The following sample shows how to update the values that are out of SQL date ranges to `NULL` in Delta Lake:
### Please create a master key in the database or open the master key in the session before performing this operation
If your query fails with the error message *Please create a master key in the database or open the master key in the session before performing this operation*, it means that your user database has no access to a master key at the moment.
Most likely, you just created a new user database and haven't created a master key yet.
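
In that case, creating a master key should resolve the error; a minimal sketch, where the password is a placeholder that you should replace with a strong one of your own:

```sql
-- Create the database master key in the user database.
-- The password below is a placeholder, not a recommendation.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';
```
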
### CREATE STATEMENT is not supported in master database
If your query fails with the error message `Failed to execute query. Error: CREATE EXTERNAL TABLE/DATA SOURCE/DATABASE SCOPED CREDENTIAL/FILE FORMAT is not supported in master database`, it means that the master database in serverless SQL pool doesn't support creation of:
- External tables
- External data sources
- Database scoped credentials
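
To create these objects, switch to (or create) a user database first; a hedged sketch with illustrative database and data source names:

```sql
-- Create a user database to hold metadata objects, then create
-- the external data source there instead of in master.
CREATE DATABASE datalake_metadata;
GO
USE datalake_metadata;
GO
CREATE EXTERNAL DATA SOURCE my_data_source
WITH (LOCATION = 'https://contosostorage.dfs.core.windows.net/container');
```
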
## Cosmos DB
Serverless SQL pools enable you to query Cosmos DB analytical storage by using the `OPENROWSET` function. Make sure that your Cosmos DB container has analytical storage. Make sure that you correctly specified the account, database, and container name. Also, make sure that your Cosmos DB account key is valid. For more information, see the [prerequisites](query-cosmos-db-analytical-store.md#prerequisites).
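
A minimal sketch of such a query; the account, database, container, and key below are placeholders:

```sql
-- Query Cosmos DB analytical storage with OPENROWSET.
-- Replace the connection-string values and container name
-- with your own; the key must be a valid account key.
SELECT TOP 10 *
FROM OPENROWSET(
    'CosmosDB',
    'Account=my-cosmos-account;Database=my-database;Key=<account key>',
    my_container
) AS documents;
```
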
### Cannot query CosmosDB using the OPENROWSET function
### Cannot access Cosmos DB account
You must use a read-only Cosmos DB key to access your analytical storage, so make sure that it didn't expire and that it wasn't regenerated.
If you get the [Resolving Cosmos DB path has failed](#resolving-cosmosdb-path-has-failed) error, make sure that you configured the firewall.