Commit 989fa77

Tweak wording/format
1 parent 83405c4 commit 989fa77

File tree

1 file changed (+4, -4 lines changed)

articles/purview/register-scan-snowflake.md

Lines changed: 4 additions & 4 deletions
@@ -25,8 +25,8 @@ This article outlines how to register Snowflake, and how to authenticate and int
 When scanning Snowflake, Purview supports:

-- Extract metadata including Snowflake servers, databases, schemas, tables, views, stored procedures, functions, pipes, stages, streams, tasks, sequences, and table/view/stream columns.
-- Fetch static lineage on assets relationships among tables, views and streams.
+- Extract metadata including Snowflake server, databases, schemas, tables, views, stored procedures, functions, pipes, stages, streams, tasks, sequences, and table/view/stream columns.
+- Fetch static lineage on assets relationships among tables, views, and streams.

 ## Prerequisites

@@ -61,7 +61,7 @@ On the **Register sources (Snowflake)** screen, do the following:
 1. Enter a **Name** that the data source will be listed within the Catalog.

-1. Enter the **server** URL used to connect to the Snowflake account, e.g. `xy12345.east-us-2.azure.snowflakecomputing.com`.
+1. Enter the **server** URL used to connect to the Snowflake account, for example, `xy12345.east-us-2.azure.snowflakecomputing.com`.

 1. Select a collection or create a new one (Optional)
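The server value in this hunk follows Snowflake's account-URL pattern (`<account>.<region>.<cloud>.snowflakecomputing.com`). As a minimal sketch, assembling such a value from an account locator could look like the following; the `server_url` helper and its default region are illustrative assumptions, not part of the article or the Purview UI:

```python
def server_url(account_locator: str, region: str = "east-us-2.azure") -> str:
    """Build a Snowflake account URL from an account locator.

    Hypothetical helper for illustration only; the default region matches
    the example in the diff above.
    """
    return f"{account_locator}.{region}.snowflakecomputing.com"

print(server_url("xy12345"))
# xy12345.east-us-2.azure.snowflakecomputing.com
```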

@@ -115,7 +115,7 @@ To create and run a new scan, do the following:
 Usage of NOT and special characters are not acceptable.

-1. **Maximum memory available**: Maximum memory (in GB) available on customer's VM to be used by scanning processes. This is dependent on the size of Snowflake source to be scanned.
+1. **Maximum memory available**: Maximum memory (in GB) available on customer's VM to be used by scanning processes. It's dependent on the size of Snowflake source to be scanned.

 > [!Note]
 > As a thumb rule, please provide 1GB memory for every 1000 tables
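The note's rule of thumb (roughly 1 GB of VM memory per 1,000 tables) can be turned into a quick sizing calculation. A sketch under that assumption; `recommended_memory_gb` is a hypothetical helper, not a Purview API:

```python
import math

def recommended_memory_gb(table_count: int) -> int:
    # Rule of thumb from the note above: ~1 GB per 1,000 tables,
    # rounded up, with a 1 GB floor. Illustrative only.
    return max(1, math.ceil(table_count / 1000))

print(recommended_memory_gb(2500))  # 3
```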
