
Commit a62a877

Merge pull request #303613 from whhender/suggestion-resolution-july-2025
Suggestion resolution july 2025
2 parents 6109ffb + 5c3fd2d commit a62a877

14 files changed (+67 -67 lines changed)

articles/data-factory/transform-data-databricks-jar.md

Lines changed: 2 additions & 2 deletions
@@ -121,13 +121,13 @@ For more information, see the [Databricks documentation](/azure/databricks/dev-t
 1. [Use the Databricks workspace UI](/azure/databricks/libraries/cluster-libraries#install-a-library-on-a-cluster)

-2. To obtain the dbfs path of the library added using UI, you can use [Databricks CLI](/azure/databricks/dev-tools/cli/fs-commands#list-the-contents-of-a-directory).
+2. To obtain the dbfs path of the library added using UI, you can use [Databricks CLI](/azure/databricks/dev-tools/cli/reference/fs-commands#databricks-fs-ls).

 Typically the Jar libraries are stored under dbfs:/FileStore/jars while using the UI. You can list all through the CLI: *databricks fs ls dbfs:/FileStore/job-jars*

 ### Or you can use the Databricks CLI:

-1. Follow [Copy the library using Databricks CLI](/azure/databricks/dev-tools/cli/fs-commands#copy-a-directory-or-a-file)
+1. Follow [Copy the library using Databricks CLI](/azure/databricks/dev-tools/cli/reference/fs-commands#databricks-fs-cp)

 2. Use Databricks CLI [(installation steps)](/azure/databricks/dev-tools/cli/commands#compute-commands)
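To see the two CLI calls these links describe end to end, here's a minimal sketch wrapped in Python's subprocess; the local JAR path and the target DBFS folder are illustrative placeholders, not values from the article:

```python
# Hedged sketch only: list library JARs already in DBFS, then copy a local JAR up,
# using the Databricks CLI commands the updated links point to (databricks fs ls / cp).
# The paths below are illustrative placeholders.
import subprocess

# List libraries uploaded through the workspace UI.
subprocess.run(["databricks", "fs", "ls", "dbfs:/FileStore/job-jars"], check=True)

# Copy a locally built JAR into DBFS so a pipeline activity can reference it.
subprocess.run(
    ["databricks", "fs", "cp", "./target/my-library.jar", "dbfs:/FileStore/job-jars/"],
    check=True,
)
```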

articles/data-factory/transform-data-databricks-notebook.md

Lines changed: 2 additions & 2 deletions
@@ -140,13 +140,13 @@ In certain cases, you might require to pass back certain values from notebook ba
 1. [Use the Databricks workspace UI](/azure/databricks/libraries/cluster-libraries#install-a-library-on-a-cluster)

-2. To obtain the dbfs path of the library added using UI, you can use [Databricks CLI](/azure/databricks/dev-tools/cli/fs-commands#list-the-contents-of-a-directory).
+2. To obtain the dbfs path of the library added using UI, you can use [Databricks CLI](/azure/databricks/dev-tools/cli/reference/fs-commands#databricks-fs-ls).

 Typically the Jar libraries are stored under dbfs:/FileStore/jars while using the UI. You can list all through the CLI: *databricks fs ls dbfs:/FileStore/job-jars*

 ### Or you can use the Databricks CLI:

-1. Follow [Copy the library using Databricks CLI](/azure/databricks/dev-tools/cli/fs-commands#copy-a-directory-or-a-file)
+1. Follow [Copy the library using Databricks CLI](/azure/databricks/dev-tools/cli/reference/fs-commands#databricks-fs-cp)

 2. Use Databricks CLI [(installation steps)](/azure/databricks/dev-tools/cli/commands#compute-commands)

articles/synapse-analytics/how-to-analyze-complex-schema.md

Lines changed: 2 additions & 2 deletions
@@ -1,12 +1,12 @@
 ---
 title: Analyze schema with arrays and nested structures
 description: How to analyze arrays and nested structures with Apache Spark and SQL
-author: Rodrigossz
+author: im-microsoft
 ms.service: azure-synapse-analytics
 ms.topic: how-to
 ms.subservice: spark
 ms.date: 06/15/2020
-ms.author: rosouz
+ms.author: imotiwala

 ---

articles/synapse-analytics/quickstart-connect-synapse-link-cosmos-db.md

Lines changed: 2 additions & 2 deletions
@@ -1,12 +1,12 @@
 ---
 title: 'Quickstart: Connect to Azure Synapse Link for Azure Cosmos DB'
 description: How to connect an Azure Cosmos DB to a Synapse workspace with Synapse Link
-author: Rodrigossz
+author: im-microsoft
 ms.service: azure-synapse-analytics
 ms.subservice: synapse-link
 ms.topic: quickstart
 ms.date: 04/21/2020
-ms.author: rosouz
+ms.author: imotiwala
 ms.custom: cosmos-db, mode-other
 ---

articles/synapse-analytics/synapse-link/concept-synapse-link-cosmos-db-support.md

Lines changed: 2 additions & 2 deletions
@@ -1,12 +1,12 @@
 ---
 title: Azure Synapse Link for Azure Cosmos DB supported features
 description: Understand the current list of actions supported by Azure Synapse Link for Azure Cosmos DB
-author: Rodrigossz
+author: im-microsoft
 ms.service: azure-synapse-analytics
 ms.topic: conceptual
 ms.subservice: synapse-link
 ms.date: 06/02/2021
-ms.author: rosouz
+ms.author: imotiwala
 ms.custom: cosmos-db
 ---

articles/synapse-analytics/synapse-link/connect-synapse-link-sql-database.md

Lines changed: 2 additions & 2 deletions
@@ -1,8 +1,8 @@
 ---
 title: Get started with Azure Synapse Link for Azure SQL Database
 description: Learn how to connect an Azure SQL database to an Azure Synapse workspace with Azure Synapse Link.
-author: Rodrigossz
-ms.author: rosouz
+author: im-microsoft
+ms.author: imotiwala
 ms.date: 08/06/2024
 ms.service: azure-synapse-analytics
 ms.subservice: synapse-link

articles/synapse-analytics/synapse-link/connect-synapse-link-sql-server-2022.md

Lines changed: 2 additions & 2 deletions
@@ -1,13 +1,13 @@
 ---
 title: Create Azure Synapse Link for SQL Server 2022
 description: Learn how to create and connect a SQL Server 2022 instance to an Azure Synapse workspace by using Azure Synapse Link.
-author: Rodrigossz
+author: im-microsoft
 ms.service: azure-synapse-analytics
 ms.topic: how-to
 ms.subservice: synapse-link
 ms.custom: engagement-fy23
 ms.date: 07/30/2024
-ms.author: rosouz
+ms.author: imotiwala
 ---

 # Get started with Azure Synapse Link for SQL Server 2022

articles/synapse-analytics/synapse-link/how-to-connect-synapse-link-cosmos-db.md

Lines changed: 2 additions & 2 deletions
@@ -1,12 +1,12 @@
 ---
 title: Connect to Azure Synapse Link for Azure Cosmos DB
 description: Learn how to connect an Azure Cosmos DB database to an Azure Synapse workspace with Azure Synapse Link.
-author: Rodrigossz
+author: im-microsoft
 ms.service: azure-synapse-analytics
 ms.topic: quickstart
 ms.subservice: synapse-link
 ms.date: 03/02/2021
-ms.author: rosouz
+ms.author: imotiwala
 ms.custom: cosmos-db, mode-other
 ---

articles/synapse-analytics/synapse-link/how-to-copy-to-sql-pool.md

Lines changed: 2 additions & 2 deletions
@@ -1,12 +1,12 @@
 ---
 title: Copy Azure Synapse Link for Azure Cosmos DB data into a dedicated SQL pool using Apache Spark
 description: Load the data into a Spark dataframe, curate the data, and load it into a dedicated SQL pool table
-author: Rodrigossz
+author: im-microsoft
 ms.service: azure-synapse-analytics
 ms.topic: quickstart
 ms.subservice: synapse-link
 ms.date: 08/10/2020
-ms.author: rosouz
+ms.author: imotiwala
 ms.reviewer: sidandrews
 ms.custom: cosmos-db, mode-other
 ---

articles/synapse-analytics/synapse-link/how-to-query-analytical-store-spark-3.md

Lines changed: 14 additions & 14 deletions
@@ -1,12 +1,12 @@
 ---
 title: Interact with Azure Cosmos DB using Apache Spark 3 in Azure Synapse Link
 description: How to interact with Azure Cosmos DB using Apache Spark 3 in Azure Synapse Link
-author: Rodrigossz
+author: im-microsoft
 ms.service: azure-synapse-analytics
 ms.topic: quickstart
 ms.subservice: synapse-link
 ms.date: 03/04/2025
-ms.author: rosouz
+ms.author: imotiwala
 ms.custom: cosmos-db, mode-other
 ---

@@ -18,24 +18,24 @@ The following capabilities are supported while interacting with Azure Cosmos DB:
 * Synapse Apache Spark 3 allows you to analyze data in your Azure Cosmos DB containers that are enabled with Azure Synapse Link in near real-time without impacting the performance of your transactional workloads. The following two options are available to query the Azure Cosmos DB [analytical store](/azure/cosmos-db/analytical-store-introduction) from Spark:
 + Load to Spark DataFrame
 + Create Spark table
-* Synapse Apache Spark also allows you to ingest data into Azure Cosmos DB. It is important to note that data is always ingested into Azure Cosmos DB containers through the transactional store. When Synapse Link is enabled, any new inserts, updates, and deletes are then automatically synced to the analytical store.
+* Synapse Apache Spark also allows you to ingest data into Azure Cosmos DB. It's important to note that data is always ingested into Azure Cosmos DB containers through the transactional store. When Azure Synapse Link is enabled, any new inserts, updates, and deletes are then automatically synced to the analytical store.
 * Synapse Apache Spark also supports Spark structured streaming with Azure Cosmos DB as a source and a sink.

-The following sections walk you through the syntax. You can also checkout the Learn module on how to [Query Azure Cosmos DB with Apache Spark for Azure Synapse Analytics](/training/modules/query-azure-cosmos-db-with-apache-spark-for-azure-synapse-analytics/). Gestures in Azure Synapse Analytics workspace are designed to provide an easy out-of-the-box experience to get started. Gestures are visible when you right-click on an Azure Cosmos DB container in the **Data** tab of the Synapse workspace. With gestures, you can quickly generate code and tailor it to your needs. Gestures are also perfect for discovering data with a single click.
+The following sections walk you through the syntax. You can also check out the Learn module on how to [Query Azure Cosmos DB with Apache Spark for Azure Synapse Analytics](/training/modules/query-azure-cosmos-db-with-apache-spark-for-azure-synapse-analytics/). Gestures in Azure Synapse Analytics workspace are designed to provide an easy out-of-the-box experience to get started. Gestures are visible when you right-click on an Azure Cosmos DB container in the **Data** tab of the Synapse workspace. With gestures, you can quickly generate code and tailor it to your needs. Gestures are also perfect for discovering data with a single click.

 > [!IMPORTANT]
 > You should be aware of some constraints in the analytical schema that could lead to the unexpected behavior in data loading operations.
-> As an example, only first 1,000 properties from transactional schema are available in the analytical schema, properties with spaces are not available, etc. If you are experiencing some unexpected results, check the [analytical store schema constraints](/azure/cosmos-db/analytical-store-introduction#schema-constraints) for more details.
+> As an example, only first 1,000 properties from transactional schema are available in the analytical schema, properties with spaces aren't available, etc. If you're experiencing some unexpected results, check the [analytical store schema constraints](/azure/cosmos-db/analytical-store-introduction#schema-constraints) for more details.

 ## Query Azure Cosmos DB analytical store

 Customers can load analytical store data to Spark DataFrames or create Spark tables.

-The difference in experience is around whether underlying data changes in the Azure Cosmos DB container should be automatically reflected in the analysis performed in Spark. When Spark DataFrames are registered, or a Spark table is created, Spark fetches analytical store metadata for efficient pushdown. It is important to note that since Spark follows a lazy evaluation policy. You need to take action to fecth the last snapshot of the data in Spark DataFrames or SparkSQL queries.
+The difference in experience is around whether underlying data changes in the Azure Cosmos DB container should be automatically reflected in the analysis performed in Spark. When Spark DataFrames are registered, or a Spark table is created, Spark fetches analytical store metadata for efficient pushdown. It's important to note that since Spark follows a lazy evaluation policy. You need to take action to fetch the last snapshot of the data in Spark DataFrames or SparkSQL queries.

 In the case of **loading to Spark DataFrame**, the fetched metadata is cached through the lifetime of the Spark session and hence subsequent actions invoked on the DataFrame are evaluated against the snapshot of the analytical store at the time of DataFrame creation.

-On the other hand, in the case of **creating a Spark table**, the metadata of the analytical store state is not cached in Spark and is reloaded on every SparkSQL query execution against the Spark table.
+On the other hand, in the case of **creating a Spark table**, the metadata of the analytical store state isn't cached in Spark and is reloaded on every SparkSQL query execution against the Spark table.

 To conclude, you can choose between loading a snapshot to Spark DataFrame or querying a Spark table for the latest snapshot.
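As a rough companion to the "load to Spark DataFrame" option above, a hedged PySpark sketch along the lines of the gesture-generated code; the linked-service and container option keys and values are assumptions to adapt to your workspace:

```python
# Hedged sketch: load the Azure Cosmos DB analytical store into a Spark DataFrame.
# Option keys and the linked service/container names are assumed placeholders.
df = spark.read.format("cosmos.olap") \
    .option("spark.synapse.linkedService", "<cosmos-linked-service>") \
    .option("spark.cosmos.container", "<container-name>") \
    .load()

# Because analytical-store metadata is cached at DataFrame creation, subsequent
# actions are evaluated against the snapshot taken at this point.
df.show(10)
```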

@@ -47,7 +47,7 @@ To conclude, you can choose between loading a snapshot to Spark DataFrame or que
 ## Authentication

-Now Spark 3.x customers can authenticate to Azure Cosmos DB analytical store using trusted identities access tokens or database account keys. Tokens are more secure as they are short lived, and assigned to the required permission using Cosmos DB RBAC.
+Now Spark 3.x customers can authenticate to Azure Cosmos DB analytical store using trusted identities access tokens or database account keys. Tokens are more secure as they're short lived, and assigned to the required permission using Cosmos DB RBAC.

 The connector now supports two auth types, `MasterKey` and `AccessToken` for the `spark.cosmos.auth.type` property.
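A hedged sketch of how the two values plug into that property on a `cosmos.olap` read; every option key other than `spark.cosmos.auth.type`, including the token-passing key, is an assumed placeholder:

```python
# Hedged sketch: switching between the two auth types on a cosmos.olap read.
reader = spark.read.format("cosmos.olap") \
    .option("spark.synapse.linkedService", "<cosmos-linked-service>") \
    .option("spark.cosmos.container", "<container-name>")

# Database account key:
df_key = reader.option("spark.cosmos.auth.type", "MasterKey").load()

# Short-lived access token, permissioned through Cosmos DB RBAC; the option
# name used to pass the token below is an assumed placeholder.
df_token = reader \
    .option("spark.cosmos.auth.type", "AccessToken") \
    .option("spark.cosmos.auth.aad.accessToken", "<access-token>") \
    .load()
```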

@@ -85,13 +85,13 @@ df.show(10)
 ```

 > [!NOTE]
-> Azure Cosmos DB's Synapse Link Spark connector does not support Managed Identity.
+> Azure Cosmos DB's Azure Synapse Link Spark connector doesn't support Managed Identity.

 #### Access token authentication requires role assignment

 To use the access token approach, you need to generate access tokens. Since access tokens are associated with Azure identities, correct role-based access control (RBAC) must be assigned to the identity. The role assignment is on data plane level, and you must have minimum control plane permissions to perform the role assignment.

-The Identity Access Management (IAM) role assignments from Azure portal are on control plane level and don't affect the role assignments on data plane. Data plane role assignments are only available via Azure CLI. The `readAnalytics` action is required to read data from analytical store in Cosmos DB and is not part of any predefined roles. As such we must create a custom role definition. In addition to the `readAnalytics` action, also add the actions required for Data Reader. Create a JSON file with the following content and name it role_definition.json
+The Identity Access Management (IAM) role assignments from Azure portal are on control plane level and don't affect the role assignments on data plane. Data plane role assignments are only available via Azure CLI. The `readAnalytics` action is required to read data from analytical store in Cosmos DB and isn't part of any predefined roles. As such we must create a custom role definition. In addition to the `readAnalytics` action, also add the actions required for Data Reader. Create a JSON file with the following content and name it role_definition.json

 ```JSON
 {
@@ -113,7 +113,7 @@ The Identity Access Management (IAM) role assignments from Azure portal are on c
 #### Access Token authentication requires Azure CLI

 - Log into Azure CLI: `az login`
-- Set the default subscription which has your Cosmos DB account: `az account set --subscription <name or id>`
+- Set the default subscription, which has your Cosmos DB account: `az account set --subscription <name or id>`
 - Create the role definition in the desired Cosmos DB account: `az cosmosdb sql role definition create --account-name <cosmos-account-name> --resource-group <resource-group-name> --body @role_definition.json`
 - Copy over the role `definition id` returned: `/subscriptions/<subscription-id>/resourceGroups/<resource-group-name>/providers/Microsoft.DocumentDB/databaseAccounts/< cosmos-account-name >/sqlRoleDefinitions/<a-random-generated-guid>`
 - Get the principal ID of the identity that you want to assign the role to. The identity could be an Azure app registration, a virtual machine, or any other supported Azure resource. Assign the role to the principal using: `az cosmosdb sql role assignment create --account-name "<cosmos-account-name>" --resource-group "<resource-group>" --scope "/" --principal-id "<principal-id-of-identity>" --role-definition-id "<role-definition-id-from-previous-step>"`
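Once the role assignment above is in place, the identity can fetch a data-plane token programmatically; a hedged sketch using `azure-identity`, with the token scope shown only as an assumption to verify for your account:

```python
# Hedged sketch: acquire a data-plane access token for the identity that
# received the custom role above. The scope URL is an assumption; confirm it
# for your Cosmos DB account before relying on it.
from azure.identity import DefaultAzureCredential

credential = DefaultAzureCredential()
token = credential.get_token("https://<cosmos-account-name>.documents.azure.com/.default")

# token.token is the bearer string to supply to the connector's AccessToken auth;
# token.expires_on tells you when to refresh it.
print(token.expires_on)
```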
@@ -171,7 +171,7 @@ val df_olap = spark.read.format("cosmos.olap").
 ### Create Spark table

-In this example, you create a Spark table that points the Azure Cosmos DB analytical store. You can then perform additional analysis by invoking SparkSQL queries against the table. This operation doesn't impact transactional store or incur data movement. If you decide to delete this Spark table, the underlying Azure Cosmos DB container and the corresponding analytical store will not be affected.
+In this example, you create a Spark table that points the Azure Cosmos DB analytical store. You can then perform more analysis by invoking SparkSQL queries against the table. This operation doesn't impact transactional store or incur data movement. If you decide to delete this Spark table, the underlying Azure Cosmos DB container and the corresponding analytical store won't be affected.

 This scenario is convenient to reuse Spark tables through third-party tools and provide accessibility to the underlying data for the run-time.
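For orientation, a hedged sketch of what such a table definition can look like through `spark.sql`; the option keys, table name, and linked-service/container names are assumed placeholders:

```python
# Hedged sketch: register a Spark table over the analytical store, then query it.
# Option keys and the linked service/container/table names are assumed placeholders.
spark.sql("""
    CREATE TABLE IF NOT EXISTS my_cosmos_table
    USING cosmos.olap
    OPTIONS (
        spark.synapse.linkedService '<cosmos-linked-service>',
        spark.cosmos.container '<container-name>'
    )
""")

# Each SparkSQL query reloads the analytical-store metadata, so results track
# the latest snapshot rather than the snapshot at table-creation time.
spark.sql("SELECT COUNT(*) FROM my_cosmos_table").show()
```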

@@ -222,7 +222,7 @@ df.write.format("cosmos.oltp").
 ## Load streaming DataFrame from container
 In this gesture, you use Spark Streaming capability to load data from a container into a dataframe. The data is stored in the primary data lake account (and file system) you connected to the workspace.
 > [!NOTE]
-> If you are looking to reference external libraries in Synapse Apache Spark, learn more [here](../spark/apache-spark-azure-portal-add-libraries.md). For instance, if you are looking to ingest a Spark DataFrame to a container of Azure Cosmos DB for MongoDB, you can use the MongoDB connector for Spark [here](https://docs.mongodb.com/spark-connector/master/).
+> If you're looking to reference external libraries in Synapse Apache Spark, learn more [here](../spark/apache-spark-azure-portal-add-libraries.md). For instance, if you're looking to ingest a Spark DataFrame to a container of Azure Cosmos DB for MongoDB, you can use the MongoDB connector for Spark [here](https://docs.mongodb.com/spark-connector/master/).

 ## Load streaming DataFrame from Azure Cosmos DB container
 In this example, you use Spark's structured streaming to load data from an Azure Cosmos DB container into a Spark streaming DataFrame, using the change feed functionality in Azure Cosmos DB. The checkpoint data used by Spark will be stored in the primary data lake account (and file system) that you connected to the workspace.
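A hedged outline of the shape of such a streaming read and checkpointed write; the `cosmos.oltp.changeFeed` source name, option keys, and paths are assumptions based on the Spark 3 connector's conventions rather than values from this commit:

```python
# Hedged sketch: stream the Azure Cosmos DB change feed into a DataFrame and
# write it out with a checkpoint in the workspace's primary data lake.
# The source format name, option keys, and paths are assumed placeholders.
stream_df = spark.readStream.format("cosmos.oltp.changeFeed") \
    .option("spark.synapse.linkedService", "<cosmos-linked-service>") \
    .option("spark.cosmos.container", "<container-name>") \
    .option("spark.cosmos.changeFeed.startFrom", "Beginning") \
    .load()

query = stream_df.writeStream \
    .format("parquet") \
    .option("path", "/localReadCheckpointFolder/data") \
    .option("checkpointLocation", "/localReadCheckpointFolder/checkpoint") \
    .start()

query.awaitTermination()  # matches the query.awaitTermination() context in the final hunk
```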
@@ -294,5 +294,5 @@ query.awaitTermination()
 * [Samples to get started with Azure Synapse Link on GitHub](https://aka.ms/cosmosdb-synapselink-samples)
 * [Learn what is supported in Azure Synapse Link for Azure Cosmos DB](./concept-synapse-link-cosmos-db-support.md)
-* [Connect to Synapse Link for Azure Cosmos DB](../quickstart-connect-synapse-link-cosmos-db.md)
+* [Connect to Azure Synapse Link for Azure Cosmos DB](../quickstart-connect-synapse-link-cosmos-db.md)
 * Check out the Learn module on how to [Query Azure Cosmos DB with Apache Spark for Azure Synapse Analytics](/training/modules/query-azure-cosmos-db-with-apache-spark-for-azure-synapse-analytics/).
