
Commit f80a419

Merge pull request #110619 from CarlRabeler/contextualtoc
Contextualtoc - t-sql
2 parents b48c6da + 4361175 · commit f80a419

File tree: 46 files changed (+843, -796 lines)


articles/synapse-analytics/data-integration/data-integration-data-lake.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -18,7 +18,7 @@ In this article, you'll learn how to ingest data from one location to another in
 ## Prerequisites
 
 * **Azure subscription**: If you don't have an Azure subscription, create a [free Azure account](https://azure.microsoft.com/free/) before you begin.
-* **Azure Storage account**: You use Azure Data Lake Gen 2 as a *source* data store. If you don't have a storage account, see [Create an Azure Storage account](../../storage/blobs/data-lake-storage-quickstart-create-account.md) for steps to create one.
+* **Azure Storage account**: You use Azure Data Lake Gen 2 as a *source* data store. If you don't have a storage account, see [Create an Azure Storage account](../../storage/blobs/data-lake-storage-quickstart-create-account.md?toc=/azure/synapse-analytics/toc.json&bc=/azure/synapse-analytics/breadcrumb/toc.json) for steps to create one.
 
 ## Create linked services
```
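Nearly every hunk in this commit makes the same kind of change as the one above: a `toc` and `bc` query string is appended to a cross-docset link so the target article opens under the Synapse Analytics table of contents and breadcrumb instead of its home docset. The recurring pattern, shown generically (the parameter values are the ones this commit uses throughout), is:

```
<target-article>.md?toc=/azure/synapse-analytics/toc.json&bc=/azure/synapse-analytics/breadcrumb/toc.json
```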

articles/synapse-analytics/data-integration/data-integration-sql-pool.md

Lines changed: 2 additions & 2 deletions
```diff
@@ -18,8 +18,8 @@ In this article you will learn how to ingest data from an Azure Data Lake Gen 2
 ## Prerequisites
 
 * **Azure subscription**: If you don't have an Azure subscription, create a [free Azure account](https://azure.microsoft.com/free/) before you begin.
-* **Azure storage account**: You use Azure Data Lake Storage Gen 2 as a *source* data store. If you don't have a storage account, see [Create an Azure Storage account](../../storage/blobs/data-lake-storage-quickstart-create-account.md) for steps to create one.
-* **Azure Synapse Analytics**: You use a SQL pool as a *sink* data store. If you don't have an Azure Synapse Analytics instance, see [Create a SQL pool](../../sql-database/sql-database-get-started-portal.md) for steps to create one.
+* **Azure storage account**: You use Azure Data Lake Storage Gen 2 as a *source* data store. If you don't have a storage account, see [Create an Azure Storage account](../../storage/blobs/data-lake-storage-quickstart-create-account.md?toc=/azure/synapse-analytics/toc.json&bc=/azure/synapse-analytics/breadcrumb/toc.json) for steps to create one.
+* **Azure Synapse Analytics**: You use a SQL pool as a *sink* data store. If you don't have an Azure Synapse Analytics instance, see [Create a SQL pool](../../sql-database/sql-database-get-started-portal.md?toc=/azure/synapse-analytics/toc.json&bc=/azure/synapse-analytics/breadcrumb/toc.json) for steps to create one.
 
 ## Create linked services
```

articles/synapse-analytics/overview-faq.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -87,7 +87,7 @@ A: End-to-end single sign-on experience is an important authentication process i
 
 ### Q: How do I get access to files and folders in the ADLSg2
 
-A: Access to files and folders is currently managed through ADLSg2. For more information, see [Data Lake storage access control](../storage/blobs/data-lake-storage-access-control.md).
+A: Access to files and folders is currently managed through ADLSg2. For more information, see [Data Lake storage access control](../storage/blobs/data-lake-storage-access-control.md?toc=/azure/synapse-analytics/toc.json&bc=/azure/synapse-analytics/breadcrumb/toc.json).
 
 ### Q: Can I use third-party business intelligence tools to access Azure Synapse Analytics
```

articles/synapse-analytics/quickstart-create-workspace.md

Lines changed: 2 additions & 2 deletions
```diff
@@ -19,7 +19,7 @@ If you don't have an Azure subscription, [create a free account before you begin
 
 ## Prerequisites
 
-- [Azure Data Lake Storage Gen2 storage account](../storage/common/storage-account-create.md)
+- [Azure Data Lake Storage Gen2 storage account](../storage/common/storage-account-create.md?toc=/azure/synapse-analytics/toc.json&bc=/azure/synapse-analytics/breadcrumb/toc.json)
 
 ## Sign in to the Azure portal
 
@@ -36,7 +36,7 @@ Sign in to the [Azure portal](https://portal.azure.com/)
 | Setting | Suggested value | Description |
 | :------ | :-------------- | :---------- |
 | **Subscription** | *Your subscription* | For details about your subscriptions, see [Subscriptions](https://account.windowsazure.com/Subscriptions). |
-| **Resource group** | *Any resource group* | For valid resource group names, see [Naming rules and restrictions](https://docs.microsoft.com/azure/architecture/best-practices/naming-conventions). |
+| **Resource group** | *Any resource group* | For valid resource group names, see [Naming rules and restrictions](/azure/architecture/best-practices/resource-naming.md?toc=/azure/synapse-analytics/toc.json&bc=/azure/synapse-analytics/breadcrumb/toc.json&view=azure-sqldw-latest). |
 | **Workspace name** | mysampleworkspace | Specifies the name of the workspace, which will also be used for connection endpoints.|
 | **Region** | East US2 | Specifies the location of the workspace.|
 | **Data Lake Storage Gen2** | Account: `storage account name` </br> File system: `root file system to use` | Specifies the ADLS Gen2 storage account name to use as primary storage and the file system to use.|
```

articles/synapse-analytics/quickstart-sql-on-demand.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -152,7 +152,7 @@ Files are stored in *json* container, folder *books*, and contain single book en
 
 ### Querying JSON files
 
-Following query shows how to use [JSON_VALUE](https://docs.microsoft.com/sql/t-sql/functions/json-value-transact-sql?view=sql-server-2017) to retrieve scalar values (title, publisher) from a book with the title *Probabilistic and Statistical Methods in Cryptology, An Introduction by Selected articles*:
+Following query shows how to use [JSON_VALUE](/sql/t-sql/functions/json-value-transact-sql?toc=/azure/synapse-analytics/toc.json&bc=/azure/synapse-analytics/breadcrumb/toc.json&view=azure-sqldw-latest) to retrieve scalar values (title, publisher) from a book with the title *Probabilistic and Statistical Methods in Cryptology, An Introduction by Selected articles*:
 
 ```sql
 SELECT
````
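The hunk ends just as the quickstart's query begins. For orientation, a minimal sketch of the JSON_VALUE technique the changed sentence describes, assuming each JSON document is read into a single `NVARCHAR` column through `OPENROWSET` (the storage URL and JSON property names are illustrative, not taken from this commit):

```sql
-- Sketch only: read each JSON file as one NVARCHAR(MAX) value, then
-- extract scalar properties from it with JSON_VALUE.
SELECT
    JSON_VALUE(doc, '$.title')     AS title,
    JSON_VALUE(doc, '$.publisher') AS publisher
FROM OPENROWSET(
        BULK 'https://mystorageaccount.blob.core.windows.net/json/books/*.json',
        FORMAT = 'CSV',
        FIELDTERMINATOR = '0x0b',  -- terminator and quote set to an unused byte
        FIELDQUOTE = '0x0b'        -- so the whole document lands in one column
    ) WITH (doc NVARCHAR(MAX)) AS rows
WHERE JSON_VALUE(doc, '$.title') LIKE 'Probabilistic and Statistical Methods%';
```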

articles/synapse-analytics/quickstart-synapse-studio.md

Lines changed: 2 additions & 2 deletions
```diff
@@ -44,7 +44,7 @@ You can create new folders and upload files using the links in toolbar to organi
 ## Query files on storage account
 
 > [!IMPORTANT]
-> You need to be a member of the `Storage Blob Reader` role on the underlying storage in order to be able to query the files. Learn how to [assign **Storage Blob Data Reader** or **Storage Blob Data Contributor** RBAC permissions on Azure Storage](../storage/common/storage-auth-aad-rbac-portal.md#assign-a-built-in-rbac-role).
+> You need to be a member of the `Storage Blob Reader` role on the underlying storage in order to be able to query the files. Learn how to [assign **Storage Blob Data Reader** or **Storage Blob Data Contributor** RBAC permissions on Azure Storage](../storage/common/storage-auth-aad-rbac-portal.md?toc=/azure/synapse-analytics/toc.json&bc=/azure/synapse-analytics/breadcrumb/toc.json#assign-a-built-in-rbac-role).
 
 1. Upload some `PARQUET` files.
 2. Select one or more files and then create a new SQL script or a Spark notebook to see the content of the files. If you want to create a notebook, you would need to create [Apache Spark pool in the workspace](spark/apache-spark-notebook-create-spark-use-sql.md).
@@ -59,7 +59,7 @@ You can create new folders and upload files using the links in toolbar to organi
 
 ## Next steps
 
-- Enable AAD users to query files [by assigning **Storage Blob Data Reader** or **Storage Blob Data Contributor** RBAC permissions on Azure Storage](../storage/common/storage-auth-aad-rbac-portal.md#assign-a-built-in-rbac-role)
+- Enable AAD users to query files [by assigning **Storage Blob Data Reader** or **Storage Blob Data Contributor** RBAC permissions on Azure Storage](../storage/common/storage-auth-aad-rbac-portal.md?toc=/azure/synapse-analytics/toc.json&bc=/azure/synapse-analytics/breadcrumb/toc.json#assign-a-built-in-rbac-role)
 - [Query files on Azure Storage using SQL On-Demand](sql/on-demand.md)
 - [Create Apache Spark pool](spark/apache-spark-notebook-create-spark-use-sql.md)
 - [Create Power BI report on files stored on Azure Storage](sql/tutorial-connect-power-bi-desktop.md)
```
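Step 2 of the context lines generates a SQL script over the selected files. A minimal sketch of what such a SQL on-demand query over an uploaded Parquet file looks like, assuming an illustrative storage URL (not taken from this commit):

```sql
-- Sketch only: preview the contents of an uploaded Parquet file.
-- Storage account, file system, and file name are illustrative.
SELECT TOP 10 *
FROM OPENROWSET(
        BULK 'https://mystorageaccount.dfs.core.windows.net/myfilesystem/mydata.parquet',
        FORMAT = 'PARQUET'
    ) AS rows;
```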

articles/synapse-analytics/security/how-to-set-up-access-control.md

Lines changed: 10 additions & 9 deletions
```diff
@@ -52,10 +52,10 @@ Create and populate three security groups for your workspace:
 
 Identify this information about your storage:
 
-- The ADLSGEN2 account to use for your workspace. This document calls it STG1. STG1 is considered the "primary" storage account for your workspace.
-- The container inside WS1 that your Synapse workspace will use by default. This document calls it CNT1. This container is used for:
-- Storing the backing data files for Spark tables
-- Execution logs for Spark jobs
+- The ADLSGEN2 account to use for your workspace. This document calls it STG1. STG1 is considered the "primary" storage account for your workspace.
+- The container inside WS1 that your Synapse workspace will use by default. This document calls it CNT1. This container is used for:
+  - Storing the backing data files for Spark tables
+  - Execution logs for Spark jobs
 
 - Using the Azure portal, assign the security groups the following roles on CNT1
 
@@ -67,9 +67,9 @@ Identify this information about your storage:
 
 In the Azure portal, create a Synapse workspace:
 
-- Name the workspace WS1
-- Choose STG1 for the Storage account
-- Choose CNT1 for the container that is being used as the "filesystem".
+- Name the workspace WS1
+- Choose STG1 for the Storage account
+- Choose CNT1 for the container that is being used as the "filesystem".
 - Open WS1 in Synapse Studio
 - Select **Manage** > **Access Control** assign the security groups to the following Synapse roles.
 - Assign **WS1\_WSAdmins** to Synapse Workspace admins
@@ -118,12 +118,13 @@ Users in each role need to complete the following steps:
 > [!NOTE]
 > [1] To create SQL or Spark pools the user must have at least Contributor role on the Synapse workspace.
 > [!TIP]
-> - Some steps will deliberately not be allowed depending on the role.
+>
+> - Some steps will deliberately not be allowed depending on the role.
 > - Keep in mind that some tasks may fail if the security was not fully configured. These tasks are noted in the table.
 
 ## STEP 8: Network Security
 
-To configure the workspace firewall, virtual network, and Private Link see instructions here \<\<insert link\>\>.
+To configure the workspace firewall, virtual network, and [Private Link](../../sql-database/sql-database-private-endpoint-overview.md).
 
 ## STEP 9: Completion
```

articles/synapse-analytics/sql-data-warehouse/cheat-sheet.md

Lines changed: 1 addition & 0 deletions
```diff
@@ -121,6 +121,7 @@ Finally, by using Gen2 of [SQL pool](sql-data-warehouse-overview-what-is.md#syna
 Learn more how to work with [resource classes and concurrency](resource-classes-for-workload-management.md).
 
 ## Lower your cost
+
 A key feature of Azure Synapse is the ability to [manage compute resources](sql-data-warehouse-manage-compute-overview.md). You can pause SQL pool when you're not using it, which stops the billing of compute resources. You can scale resources to meet your performance demands. To pause, use the [Azure portal](pause-and-resume-compute-portal.md) or [PowerShell](pause-and-resume-compute-powershell.md). To scale, use the [Azure portal](quickstart-scale-compute-portal.md), [Powershell](quickstart-scale-compute-powershell.md), [T-SQL](quickstart-scale-compute-tsql.md), or a [REST API](sql-data-warehouse-manage-compute-rest-api.md#scale-compute).
 
 Autoscale now at the time you want with Azure Functions:
```
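The paragraph in this hunk lists T-SQL among the ways to scale. As a minimal sketch, assuming an illustrative pool name and a connection to the server's master database:

```sql
-- Sketch only: scale a SQL pool by changing its service objective.
-- mySampleDataWarehouse is an illustrative pool name.
ALTER DATABASE mySampleDataWarehouse
MODIFY (SERVICE_OBJECTIVE = 'DW300c');
```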

articles/synapse-analytics/sql-data-warehouse/sql-data-warehouse-develop-transactions.md

Lines changed: 2 additions & 2 deletions
```diff
@@ -39,7 +39,7 @@ In the following table, two assumptions have been made:
 
 ## Gen2
 
-| [DWU](../../sql-data-warehouse/sql-data-warehouse-overview-what-is.md) | Cap per distribution (GB) | Number of Distributions | MAX transaction size (GB) | # Rows per distribution | Max Rows per transaction |
+| [DWU](../../sql-data-warehouse/sql-data-warehouse-overview-what-is.md?toc=/azure/synapse-analytics/toc.json&bc=/azure/synapse-analytics/breadcrumb/toc.json) | Cap per distribution (GB) | Number of Distributions | MAX transaction size (GB) | # Rows per distribution | Max Rows per transaction |
 | --- | --- | --- | --- | --- | --- |
 | DW100c |1 |60 |60 |4,000,000 |240,000,000 |
 | DW200c |1.5 |60 |90 |6,000,000 |360,000,000 |
@@ -60,7 +60,7 @@ In the following table, two assumptions have been made:
 
 ## Gen1
 
-| [DWU](../../sql-data-warehouse/sql-data-warehouse-overview-what-is.md) | Cap per distribution (GB) | Number of Distributions | MAX transaction size (GB) | # Rows per distribution | Max Rows per transaction |
+| [DWU](../../sql-data-warehouse/sql-data-warehouse-overview-what-is.md?toc=/azure/synapse-analytics/toc.json&bc=/azure/synapse-analytics/breadcrumb/toc.json) | Cap per distribution (GB) | Number of Distributions | MAX transaction size (GB) | # Rows per distribution | Max Rows per transaction |
 | --- | --- | --- | --- | --- | --- |
 | DW100 |1 |60 |60 |4,000,000 |240,000,000 |
 | DW200 |1.5 |60 |90 |6,000,000 |360,000,000 |
```
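Reading these rows: a SQL pool always spreads data across 60 distributions, so each cap scales linearly. At DW100c, for example, the 1 GB per-distribution cap times 60 distributions yields the 60 GB maximum transaction size, and 4,000,000 rows per distribution times 60 yields the 240,000,000-row ceiling.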

articles/synapse-analytics/sql-data-warehouse/sql-data-warehouse-load-from-azure-blob-storage-with-polybase.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -216,7 +216,7 @@ GO
 
 ### Load the data into new tables
 
-To load data from Azure blob storage into the data warehouse table, use the [CREATE TABLE AS SELECT (Transact-SQL)](/sql/t-sql/statements/create-table-as-select-azure-sql-data-warehouse?toc=/azure/synapse-analytics/sql-data-warehouse/toc.json&bc=/azure/synapse-analytics/sql-data-warehouse/breadcrumb/toc.json&view=azure-sqldw-latest) statement. Loading with [CTAS](../sql-data-warehouse/sql-data-warehouse-develop-ctas.md) leverages the strongly typed external tables you've created. To load the data into new tables, use one CTAS statement per table.
+To load data from Azure blob storage into the data warehouse table, use the [CREATE TABLE AS SELECT (Transact-SQL)](/sql/t-sql/statements/create-table-as-select-azure-sql-data-warehouse?toc=/azure/synapse-analytics/sql-data-warehouse/toc.json&bc=/azure/synapse-analytics/sql-data-warehouse/breadcrumb/toc.json&view=azure-sqldw-latest) statement. Loading with [CTAS](../sql-data-warehouse/sql-data-warehouse-develop-ctas.md?toc=/azure/synapse-analytics/toc.json&bc=/azure/synapse-analytics/breadcrumb/toc.json) leverages the strongly typed external tables you've created. To load the data into new tables, use one CTAS statement per table.
 
 CTAS creates a new table and populates it with the results of a select statement. CTAS defines the new table to have the same columns and data types as the results of the select statement. If you select all the columns from an external table, the new table will be a replica of the columns and data types in the external table.
```
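The context paragraph explains that each CTAS both defines and populates one new table from a select over a strongly typed external table. A minimal sketch with illustrative names (the article's tutorial uses its own table list, and the distribution choice depends on the table):

```sql
-- Sketch only: one CTAS statement per target table; names are illustrative.
CREATE TABLE dbo.DimDate
WITH
(
    DISTRIBUTION = ROUND_ROBIN,   -- pick HASH, ROUND_ROBIN, or REPLICATE per table
    CLUSTERED COLUMNSTORE INDEX
)
AS
SELECT * FROM ext.DimDate;        -- the strongly typed external table
GO
```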
