
Commit 8972994

Merge pull request #115308 from Kat-Campise/sql_articles_1
sql articles 1
2 parents 454217b + 0fcab30 commit 8972994

File tree

2 files changed: +13 −9 lines

articles/synapse-analytics/sql/best-practices-sql-on-demand.md

Lines changed: 7 additions & 7 deletions
@@ -47,23 +47,23 @@ If possible, you can prepare files for better performance:

## Push wildcards to lower levels in path

```diff
-You can use wildcards in your path to [query multiple files and folders](develop-storage-files-overview.md#query-multiple-files-or-folders). SQL on-demand lists files in your storage account starting from first * using storage API and eliminates files that do not match specified path. Reducing initial list of files can improve performance if there are many files that match specified path up to first wildcard.
+You can use wildcards in your path to [query multiple files and folders](develop-storage-files-overview.md#query-multiple-files-or-folders). SQL on-demand lists files in your storage account starting from first * using storage API and eliminates files that don't match specified path. Reducing initial list of files can improve performance if there are many files that match specified path up to first wildcard.
```
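As a minimal sketch of this guidance (the storage account, container, and file-name pattern below are hypothetical, not from the article): placing the wildcard in the file name rather than at a folder level lets SQL on-demand start listing from the deeper `csv/taxi/` prefix, so far fewer files are enumerated and then eliminated.

```sql
-- Hypothetical layout: one folder, many monthly CSV files.
-- The first * appears late in the path, so listing starts from csv/taxi/.
SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'https://contosostorage.blob.core.windows.net/csv/taxi/yellow_tripdata_2019-*.csv',
    FORMAT = 'CSV',
    PARSER_VERSION = '2.0',
    FIRSTROW = 2  -- skip the header row
) AS rows;
```

A path like `csv/*/yellow_tripdata_2019-01.csv` would instead force the listing to start at `csv/`, touching every subfolder.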

## Use appropriate data types

```diff
-Data types used in your query affects performance. You can get better performance if you:
+The data types you use in your query impact performance. You can get better performance if you:
```
- Use the smallest data size that will accommodate the largest possible value.
  - If maximum character value length is 30 characters, use character data type of length 30.
  - If all character column values are of fixed size, use char or nchar. Otherwise, use varchar or nvarchar.
  - If maximum integer column value is 500, use smallint as it is smallest data type that can accommodate this value. You can find integer data type ranges [here](https://docs.microsoft.com/sql/t-sql/data-types/int-bigint-smallint-and-tinyint-transact-sql?view=sql-server-ver15).
- If possible, use varchar and char instead of nvarchar and nchar.
```diff
-- Use integer-based data types if possible. Sort, join and group by operations are performed faster on integers than on characters data.
-- If you are using schema inference, [check inferred data type](#check-inferred-data-types).
+- Use integer-based data types if possible. Sort, join, and group by operations are performed faster on integers than on characters data.
+- If you're using schema inference, [check inferred data type](#check-inferred-data-types).
```
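A sketch of applying this list (the storage URL and column names are assumptions): the `WITH` clause of `OPENROWSET` overrides wide inferred types with the smallest ones that fit the data.

```sql
-- Hypothetical Parquet files; column names are illustrative.
SELECT *
FROM OPENROWSET(
    BULK 'https://contosostorage.blob.core.windows.net/parquet/taxi/*.parquet',
    FORMAT = 'PARQUET'
) WITH (
    vendor_id varchar(4),     -- instead of the inferred varchar(8000)
    passenger_count smallint  -- smallest integer type that fits the values
) AS rows;
```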

## Check inferred data types

```diff
-[Schema inference](query-parquet-files.md#automatic-schema-inference) helps you quickly write queries and explore data without knowing file schema. This comfort comes at expense of inferred data types being larger than they actually are. It happens when there is not enough information in source files to make sure appropriate data type is used. For example, Parquet files do not contain metadata about maximum character column length and SQL on-demand infers it as varchar(8000).
+[Schema inference](query-parquet-files.md#automatic-schema-inference) helps you quickly write queries and explore data without knowing file schema. This comfort comes at the expense of inferred data types being larger than they actually are. It happens when there isn't enough information in source files to make sure appropriate data type is used. For example, Parquet files don't contain metadata about maximum character column length and SQL on-demand infers it as varchar(8000).
```

You can check resulting data types of your query using [sp_describe_first_result_set](https://docs.microsoft.com/sql/relational-databases/system-stored-procedures/sp-describe-first-result-set-transact-sql?view=sql-server-ver15).
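A sketch of that check (the inner query text and storage URL are assumptions modeled on the article's taxi example): pass the query as a string and inspect the returned `system_type_name` column.

```sql
-- Returns one row per result-set column, including its inferred type.
EXEC sp_describe_first_result_set N'
    SELECT *
    FROM OPENROWSET(
        BULK ''https://contosostorage.blob.core.windows.net/parquet/taxi/*.parquet'',
        FORMAT = ''PARQUET''
    ) AS nyc';
```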
@@ -87,7 +87,7 @@ Here is the result set.

|0|2|pickup_datetime|datetime2(7)|8|
|0|3|passenger_count|int|4|

```diff
-Once we know inferred data types for query we can specify appropriate data types:
+Once we know inferred data types for query, we can specify appropriate data types:
```

```sql
SELECT
```
@@ -138,4 +138,4 @@ If you need better performance, try SAS credentials to access storage until AAD

## Next steps

```diff
-Review the [Troubleshooting](../sql-data-warehouse/sql-data-warehouse-troubleshoot.md?toc=/azure/synapse-analytics/toc.json&bc=/azure/synapse-analytics/breadcrumb/toc.json) article for common issues and solutions. If you're working with SQL pool rather than SQL on-demand, please see the [Best Practices for SQL pool](best-practices-sql-pool.md) article for specific guidance.
+Review the [Troubleshooting](../sql-data-warehouse/sql-data-warehouse-troubleshoot.md?toc=/azure/synapse-analytics/toc.json&bc=/azure/synapse-analytics/breadcrumb/toc.json) article for common issues and solutions. If you're working with SQL pool rather than SQL on-demand, see the [Best Practices for SQL pool](best-practices-sql-pool.md) article for specific guidance.
```

articles/synapse-analytics/sql/create-use-external-tables.md

Lines changed: 6 additions & 2 deletions
@@ -56,7 +56,11 @@ The queries in this article will be executed on your sample database and use the

## Create an external table on protected data

```diff
-You can create external tables that access data on an Azure storage account that allows access to users with some Azure AD identity or SAS key. You can create external tables the same way you create regular SQL Server external tables. The query below creates an external table that reads *population.csv* file from SynapseSQL demo Azure storage account that is referenced using `sqlondemanddemo` data source and protected with database scoped credential called `sqlondemand`. Data source and database scoped credential are created in [setup script](https://github.com/Azure-Samples/Synapse/blob/master/SQL/Samples/LdwSample/SampleDB.sql).
+You can create external tables that access data on an Azure storage account that allows access to users with some Azure AD identity or SAS key. You can create external tables the same way you create regular SQL Server external tables.
+
+The following query creates an external table that reads *population.csv* file from SynapseSQL demo Azure storage account that is referenced using `sqlondemanddemo` data source and protected with database scoped credential called `sqlondemand`.
+
+Data source and database scoped credential are created in [setup script](https://github.com/Azure-Samples/Synapse/blob/master/SQL/Samples/LdwSample/SampleDB.sql).
```

> [!NOTE]
> Change the first line in the query, i.e., [mydbname], so you're using the database you created.
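A minimal sketch of the pattern that paragraph describes (the table name, columns, and file-format name here are illustrative, not the article's exact query; the `sqlondemanddemo` data source is the one the text references):

```sql
-- Hypothetical external table over population.csv, secured by the data
-- source's database scoped credential rather than a key in the query.
CREATE EXTERNAL TABLE populationExternalTable (
    [country_code] varchar(5),
    [country_name] varchar(100),
    [year] smallint,
    [population] bigint
)
WITH (
    LOCATION = 'csv/population/population.csv',
    DATA_SOURCE = sqlondemanddemo,
    FILE_FORMAT = QuotedCsvWithHeaderFormat  -- assumed name from the setup script
);
```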
@@ -99,7 +103,7 @@ CREATE EXTERNAL TABLE Taxi (

```sql
    FILE_FORMAT = ParquetFormat
);
```

```diff
-## Use a external table
+## Use an external table
```

You can use [external tables](develop-tables-external-tables.md) in your queries the same way you use them in SQL Server queries.
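For instance, the `Taxi` external table visible in the hunk header above can be queried like any regular table (its columns and contents are not shown in this excerpt, so the sketch stays schema-agnostic):

```sql
-- External tables participate in queries exactly like ordinary tables.
SELECT TOP 10 *
FROM Taxi;
```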

0 commit comments
