`src/content/docs/snowflake/features/authentication.md` (+4 −4)

@@ -32,9 +32,9 @@ sf_conn_obj = sf.connect(
 The default username and password are set to `test` and can be changed using `SF_DEFAULT_USER` and `SF_DEFAULT_PASSWORD` when starting the Snowflake emulator.

-{{< alert title="Note" >}}
+:::note
 It is not recommended to use your production credentials in the Snowflake emulator.
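For context, connecting to the emulator with those default credentials might look like the following Python sketch; the host value is an assumption based on LocalStack's usual local endpoint naming, not something this diff confirms:

```python
import snowflake.connector as sf

# Default emulator credentials ("test"/"test"); override them by setting
# SF_DEFAULT_USER and SF_DEFAULT_PASSWORD when starting the emulator.
sf_conn_obj = sf.connect(
    user="test",
    password="test",
    account="test",
    host="snowflake.localhost.localstack.cloud",  # assumed local endpoint
)
```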
`src/content/docs/snowflake/features/clones.md` (+1 −3)

@@ -3,8 +3,6 @@ title: Clones
 description: Get started with Clones in LocalStack for Snowflake
 ---

-
-
 ## Introduction

 Cloning in Snowflake allows you to create a quick, zero-copy duplicate of an existing database, schema, or table. This feature enables users to replicate data structures and content for testing or development without duplicating the underlying storage.
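As a minimal sketch of the zero-copy behavior described above (object names are hypothetical):

```sql
-- Zero-copy clone of a table; storage is shared until either copy diverges.
CREATE TABLE orders_dev CLONE orders;

-- The same works one level up, for schemas and databases.
CREATE DATABASE analytics_test CLONE analytics;
```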
@@ -3,8 +3,6 @@
 description: Get started with cross-database resource sharing in the Snowflake emulator
 ---

-
-
 ## Introduction

 Snowflake data providers can easily share data from various databases using secure views. These views can include schemas, tables, and other views from one or more databases, as long as they're part of the same account.
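A minimal sketch of such a secure view, assuming two databases `db1` and `db2` in the same account (all names are hypothetical):

```sql
-- A secure view in db1 that selects from a table in db2,
-- sharing data across databases within the same account.
CREATE SECURE VIEW db1.public.shared_customers AS
  SELECT customer_id, customer_name
  FROM db2.public.customers;
```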
`src/content/docs/snowflake/features/dynamic-tables.md` (+1 −3)

@@ -3,8 +3,6 @@ title: Dynamic Tables
 description: Get started with Dynamic Tables in LocalStack for Snowflake
 ---

-
-
 ## Introduction

 Snowflake Dynamic Tables enable a background process to continuously load new data from sources into the table, supporting both delta and full load operations. A dynamic table automatically updates to reflect query results, removing the need for a separate target table and custom code for data transformation. This table is kept current through regularly scheduled refreshes by an automated process.
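A minimal sketch of a dynamic table kept current by scheduled refreshes (table, warehouse, and lag values are hypothetical):

```sql
-- Refreshed automatically so it lags the source by at most one minute.
CREATE DYNAMIC TABLE order_summary
  TARGET_LAG = '1 minute'
  WAREHOUSE = test_warehouse
  AS
    SELECT customer_id, SUM(amount) AS total_amount
    FROM orders
    GROUP BY customer_id;
```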
`src/content/docs/snowflake/features/hybrid-tables.md` (+1 −3)

@@ -3,8 +3,6 @@ title: Hybrid Tables
 description: Get started with Hybrid Tables in LocalStack for Snowflake
 ---

-
-
 ## Introduction

 Snowflake Hybrid tables, also known as Unistore hybrid tables, support fast, single-row operations by enforcing unique constraints for required primary keys and including indexes to speed up data retrieval. These tables are designed to optimize support for both analytical and transactional workloads simultaneously, underpinning Snowflake's Unistore architecture.
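A minimal sketch (table name hypothetical); the required primary key backs the enforced unique constraint and index described above:

```sql
CREATE HYBRID TABLE user_sessions (
  session_id INT PRIMARY KEY,  -- required; enforced and indexed for fast single-row lookups
  user_id INT,
  last_seen_at TIMESTAMP
);
```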
`src/content/docs/snowflake/features/iceberg-tables.md` (+3 −5)

@@ -3,8 +3,6 @@ title: Iceberg Tables
 description: This is a dummy description.
 ---

-
-
 ## Introduction

 Iceberg tables use the [Apache Iceberg](https://iceberg.apache.org/) open table format specification to provide an abstraction layer on data files stored in open formats. Iceberg tables for Snowflake offer schema evolution, partitioning, and snapshot isolation to manage the table data efficiently.
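A hedged sketch of creating a Snowflake-managed Iceberg table; the external volume name is hypothetical, and this diff does not confirm the exact options the emulator supports:

```sql
-- Iceberg table using Snowflake as the catalog; data and metadata files
-- live on the external volume in open formats.
CREATE ICEBERG TABLE events (
  event_id INT,
  payload STRING
)
  CATALOG = 'SNOWFLAKE'
  EXTERNAL_VOLUME = 'my_iceberg_volume'  -- hypothetical external volume
  BASE_LOCATION = 'events/';
```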
`src/content/docs/snowflake/features/snowpipe.md` (+14 −14)

@@ -9,10 +9,10 @@ Snowpipe allows you to load data into Snowflake tables from files stored in an external stage

 The Snowflake emulator supports Snowpipe, allowing you to create and manage Snowpipe objects in the emulator. You can use Snowpipe to load data into Snowflake tables from files stored in a local directory or a local/remote S3 bucket. The following operations are supported:
@@ -24,9 +24,9 @@ In this guide, you will create a stage, and a pipe to load data from a local S3 bucket

 You can create a local S3 bucket using the `mb` command with the `awslocal` CLI.

-{{< command >}}
-$ awslocal s3 mb s3://test-bucket
-{{< / command >}}
+```bash
+awslocal s3 mb s3://test-bucket
+```

 ### Create a stage
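The stage and pipe themselves might be created along the following lines; `test_pipe` matches the pipe name referenced later in the guide, while the stage and target table names are hypothetical:

```sql
-- External stage pointing at the local bucket created above.
CREATE STAGE test_stage URL = 's3://test-bucket';

-- Pipe that auto-ingests new files from the stage into a target table.
CREATE PIPE test_pipe AUTO_INGEST = TRUE AS
  COPY INTO test_table
  FROM @test_stage
  FILE_FORMAT = (TYPE = 'JSON');
```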
@@ -73,11 +73,11 @@ Retrieve the `notification_channel` value from the output of the `DESC PIPE` query

 You can use the [`PutBucketNotificationConfiguration`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_PutBucketNotificationConfiguration.html) API to create a bucket notification configuration that sends notifications to Snowflake when new files are uploaded to the S3 bucket.
 The `notification.json` file should contain the following configuration:
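Applying that configuration could look like the following `awslocal` call, assuming the `test-bucket` bucket created earlier:

```bash
# Attach the pipe's notification channel to the bucket so that new
# uploads notify Snowpipe; notification.json is the file described above.
awslocal s3api put-bucket-notification-configuration \
  --bucket test-bucket \
  --notification-configuration file://notification.json
```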
@@ -107,14 +107,14 @@ Copy a JSON file to the S3 bucket to trigger the pipe to load the data into the table

 Upload the file to the S3 bucket:

-{{< command >}}
-$ awslocal s3 cp test.json s3://test-bucket/
-{{< / command >}}
+```bash
+awslocal s3 cp test.json s3://test-bucket/
+```

 ### Check the data

 After uploading the file to the S3 bucket in the previous step, the contents of the JSON file should get inserted into the table automatically by the `test_pipe` pipe. You can check the data in the table using the following query:
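That check can be a plain select on the pipe's target table; the table name here is hypothetical and should match whatever the pipe's `COPY INTO` statement targets:

```sql
-- Rows ingested from test.json by the test_pipe pipe.
SELECT * FROM test_table;
```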
 In this example, you can upload the CSV files to the table stage provided for the `employees` table.

-{{< tabpane >}}
-{{< tab header="Linux/macOS" lang="sql" >}}
+
+<Tabs>
+<TabItem label="Linux/macOS">
+```sql
 PUT file://./employees0*.csv @@employees_stage AUTO_COMPRESS=TRUE;
-{{< /tab >}}
-{{< tab header="Windows" lang="sql" >}}
+```
+</TabItem>
+<TabItem label="Windows">
+```sql
 PUT file://C:\temp\employees0*.csv @@employees_stage AUTO_COMPRESS=TRUE;
-{{< /tab >}}
-{{< /tabpane >}}
+```
+</TabItem>
+</Tabs>
+
+
+

 The expected output is:
@@ -87,10 +94,10 @@ The expected output is:

 You can also load data from an S3 bucket using the `CREATE STAGE` command. Create a new S3 bucket named `testbucket` and upload the [employees CSV files](./getting-started.zip) to the bucket. You can use LocalStack's `awslocal` CLI to create the S3 bucket and upload the files.

-{{< command >}}
+```bash
 awslocal s3 mb s3://testbucket
 awslocal s3 cp employees0*.csv s3://testbucket
-{{< /command >}}
+```

 In this example, you can create a stage called `my_s3_stage` to load data from an S3 bucket:
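A sketch of that stage definition, assuming the `testbucket` bucket created above:

```sql
-- Stage backed by the local testbucket bucket holding the employee CSV files.
CREATE STAGE my_s3_stage URL = 's3://testbucket';
```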
@@ -107,4 +114,4 @@ You can further copy data from the S3 stage to the table using the `COPY INTO` command
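That final step could look like the following, assuming an `employees` table whose columns match the CSV layout:

```sql
-- Load the staged CSV files into the employees table.
COPY INTO employees
  FROM @my_s3_stage
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);
```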