Commit 7d6a978

committed: minor
1 parent 014303d commit 7d6a978

2 files changed: +6 -8 lines changed


articles/synapse-analytics/synapse-link/how-to-query-analytical-store-spark.md

Lines changed: 5 additions & 7 deletions
@@ -13,13 +13,11 @@ ms.reviewer: jrasnick
 
 # Query Cosmos DB analytical store with Synapse Spark
 
-This article gives some examples on how you can interact with the analytical store from Synapse gestures. Those gestures are visible when you right-click on a container.
-
-When you right click into a container, Synapse will be able to infer which linked service, database and container it refers to. Such gestures are very simple to get quickly code and tweak it to your needs but they are also perfect for discovering data in a single click.
+This article gives some examples of how you can interact with the analytical store from Synapse gestures. Those gestures are visible when you right-click a container. With gestures, you can quickly generate code and tweak it to your needs. They are also perfect for discovering data with a single click.
 
 ## Load to DataFrame
 
-In this step, you will read from Azure Cosmos DB analytical store into a Spark DataFrame and display 10 rows from the DataFrame called df. Once your data is into dataframe, you can perform additional analysis. This operation does not impact the transactional store.
+In this step, you will read data from the Azure Cosmos DB analytical store into a Spark DataFrame and display 10 rows from the DataFrame called ***df***. Once your data is in the DataFrame, you can perform additional analysis. This operation does not impact the transactional store.
 
 ```python
 # To select a preferred list of regions in a multi-region Cosmos DB account, add .option("spark.cosmos.preferredRegions", "<Region1>,<Region2>")
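The generated read snippet is truncated at the hunk boundary above. As a minimal sketch (not the committed code), the helper below assembles the option map that the gesture passes to `spark.read.format("cosmos.olap")`; the linked-service and container names are placeholders, and the option keys are the ones documented for the Synapse Link Spark connector.

```python
def cosmos_olap_read_options(linked_service, container, preferred_regions=None):
    """Assemble options for spark.read.format("cosmos.olap") (sketch, placeholder names)."""
    options = {
        "spark.synapse.linkedService": linked_service,  # linked service inferred by the gesture
        "spark.cosmos.container": container,            # container you right-clicked
    }
    if preferred_regions:
        # Optional: preferred region list for a multi-region Cosmos DB account.
        options["spark.cosmos.preferredRegions"] = ",".join(preferred_regions)
    return options

opts = cosmos_olap_read_options("MyLinkedService", "MyContainer", ["West US 2", "East US"])
print(opts["spark.cosmos.preferredRegions"])  # West US 2,East US
```

In a live Synapse Spark session the equivalent would be to chain `.option(key, value)` for each pair, call `.load()`, and then `display(df.limit(10))`.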
@@ -34,7 +32,7 @@ df = spark.read.format("cosmos.olap")\
 
 ## Create Spark table
 
-In this gesture, you will create a Spark table pointing to the container you selected. That operation does not incur any data movement. If you decide to delete that table, the underlying container (and corresponding analytical store) won't be impacted. This scenario is very convenient to reuse tables through 3rd party tools and provide accessibility to the data for the run-time.
+In this gesture, you will create a Spark table pointing to the container you selected. That operation does not incur any data movement. If you decide to delete that table, the underlying container (and corresponding analytical store) won't be affected. This scenario is convenient for reusing tables through third-party tools and keeping the data accessible at run time.
 
 ```sql
 %%sql
@@ -47,7 +45,7 @@ create table call_center using cosmos.olap options (
 ```
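The `create table` statement itself is cut off at the hunk boundary. The sketch below builds the full statement as a Python string so its shape can be checked without a Spark session; the option keys and the `MyLinkedService` name are assumptions based on the snippet above, not the committed code.

```python
def create_spark_table_sql(table, linked_service, container):
    """Build the %%sql create-table statement the gesture generates (sketch, assumed keys)."""
    return (
        f"create table {table} using cosmos.olap options (\n"
        f"    spark.synapse.linkedService '{linked_service}',\n"
        f"    spark.cosmos.container '{container}'\n"
        f")"
    )

print(create_spark_table_sql("call_center", "MyLinkedService", "call_center"))
```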
 
 ## Write DataFrame to container
-In this gesture, you will write back a dataframe into a container. This operation will impact the transactional performance and consume Request Units. Using Azure Cosmos DB transactional performance will optimize the speed and reliability of those write transactions. Make sure that you replace **YOURDATAFRAME** by the dataframe that you want to write back.
+In this gesture, you will write a DataFrame back into a container. This operation impacts transactional performance and consumes Request Units. The Azure Cosmos DB transactional store is optimized for the speed and reliability of those write transactions. Make sure that you replace **YOURDATAFRAME** with the DataFrame that you want to write back.
 
 ```python
 # Write a Spark DataFrame into a Cosmos DB container
@@ -63,7 +61,7 @@ YOURDATAFRAME.write.format("cosmos.oltp")\
 ```
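The write snippet is also truncated above. As a hedged sketch, the helper below captures the settings the write-back gesture applies; the `append` mode is an assumption, and the names are placeholders rather than the committed code.

```python
def cosmos_oltp_write_plan(linked_service, container):
    """Settings for YOURDATAFRAME.write.format("cosmos.oltp") (sketch, assumed values)."""
    return {
        "format": "cosmos.oltp",  # writes target the transactional store (consumes RUs)
        "mode": "append",         # assumption: append semantics for write-back
        "options": {
            "spark.synapse.linkedService": linked_service,
            "spark.cosmos.container": container,
        },
    }
```

With a live session this corresponds to chaining `.option(...)` calls on `YOURDATAFRAME.write.format("cosmos.oltp")`, then `.mode("append").save()`.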
 
 ## Load streaming DataFrame from container
-In this gesture, you will use Spark Streaming capability with change feed support to load data from a container into a dataframe with data being stored into the primary data lake account that you connected to the workspace. If the folder /localReadCheckpointFolder is not created, it will be automatically created. This operation will impact the transactional performance of Cosmos DB.
+In this gesture, you will use the Spark Streaming capability to load data from a container into a streaming DataFrame. The data is stored in the primary data lake account (and file system) that you connected to the workspace. If the folder /localReadCheckpointFolder doesn't exist, it is created automatically. This operation impacts the transactional performance of Cosmos DB.
 
 ```python
 # To select a preferred list of regions in a multi-region Cosmos DB account, add .option("spark.cosmos.preferredRegions", "<Region1>,<Region2>")
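The streaming snippet is likewise truncated. The sketch below assembles the `readStream` options; the change-feed and checkpoint option keys are assumptions based on the Synapse Link Spark connector, and the linked-service and container names are placeholders.

```python
def cosmos_stream_read_options(linked_service, container,
                               checkpoint="/localReadCheckpointFolder"):
    """Options for spark.readStream.format("cosmos.oltp") (sketch, assumed keys)."""
    return {
        "spark.synapse.linkedService": linked_service,
        "spark.cosmos.container": container,
        # Assumption: read the change feed from the beginning of the container.
        "spark.cosmos.changeFeedStartFromTheBeginning": "true",
        # Checkpoints land in the workspace's primary data lake account;
        # the folder is created automatically if it doesn't exist.
        "spark.cosmos.changeFeedCheckpointLocation": checkpoint,
    }
```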

articles/synapse-analytics/toc.yml

Lines changed: 1 addition & 1 deletion
@@ -78,7 +78,7 @@
 href: ../data-factory/concepts-data-flow-overview.md?toc=/azure/synapse-analytics/toc.json&bc=/azure/synapse-analytics/breadcrumb/toc.json
 - name: Maintenance schedule
 href: ./sql-data-warehouse/maintenance-scheduling.md?toc=/azure/synapse-analytics/toc.json&bc=/azure/synapse-analytics/breadcrumb/toc.json
-- name: Backup and restore
+- name: Back up and restore
 href: ./sql-data-warehouse/backup-and-restore.md?toc=/azure/synapse-analytics/toc.json&bc=/azure/synapse-analytics/breadcrumb/toc.json
 - name: Monitoring
 items:
