
Commit 56bc552

remove hugo semantics

1 parent ba3ecc5 · commit 56bc552

16 files changed: +62 -75 lines

src/content/docs/snowflake/features/accounts.md

Lines changed: 2 additions & 2 deletions

@@ -47,7 +47,7 @@ You can also specify the account for Snowflake drivers that let you connect with

 Example establishing a JDBC connection:

-```
+```text
 jdbc:snowflake://snowflake.localhost.localstack.cloud:4566/?account=your_account
 ```

@@ -67,4 +67,4 @@ The query statement will return the name of the account you are currently connec
 |------------------------------------------|
 | YOUR_ACCOUNT                             |
 +------------------------------------------+
-```
+```
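
For reference, the connection string above maps directly onto the Python connector. A minimal sketch, assuming the emulator's default `test`/`test` credentials and the `snowflake-connector-python` package; the account name is a placeholder, as in the JDBC example:

```python
import snowflake.connector

# Point the connector at the emulator instead of the real service.
# `account` is a placeholder, as in the JDBC URL above.
conn = snowflake.connector.connect(
    user="test",
    password="test",
    account="your_account",
    host="snowflake.localhost.localstack.cloud",
    port=4566,
)

cur = conn.cursor()
cur.execute("SELECT CURRENT_ACCOUNT()")
print(cur.fetchone())  # expected: ('YOUR_ACCOUNT',)
conn.close()
```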

src/content/docs/snowflake/features/authentication.md

Lines changed: 4 additions & 4 deletions

@@ -32,9 +32,9 @@ sf_conn_obj = sf.connect(

 The default username and password are set to `test` and can be changed using `SF_DEFAULT_USER` and `SF_DEFAULT_PASSWORD` when starting the Snowflake emulator.

-{{< alert title="Note" >}}
+:::note
 It is not recommended to use your production credentials in the Snowflake emulator.
-{{< /alert >}}
+:::

 ## RSA key pair authentication

@@ -66,6 +66,6 @@ conn = snowflake.connector.connect(
 )
 ```

-{{< alert title="Note" >}}
+:::note
 The Snowflake emulator does not validate key contents—RSA authentication is mocked for local testing only.
-{{< /alert >}}
+:::
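
Since the emulator mocks RSA validation, any well-formed key works. A sketch of the key-pair flow, assuming the `cryptography` package; the account name is a placeholder:

```python
import snowflake.connector
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

# Generate a throwaway key: the emulator never validates its contents.
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
private_key_der = key.private_bytes(
    encoding=serialization.Encoding.DER,
    format=serialization.PrivateFormat.PKCS8,
    encryption_algorithm=serialization.NoEncryption(),
)

conn = snowflake.connector.connect(
    user="test",
    account="test",  # placeholder account
    private_key=private_key_der,
    host="snowflake.localhost.localstack.cloud",
    port=4566,
)
```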

src/content/docs/snowflake/features/clones.md

Lines changed: 1 addition & 3 deletions

@@ -3,8 +3,6 @@ title: Clones
 description: Get started with Clones in LocalStack for Snowflake
 ---

-
-
 ## Introduction

 Cloning in Snowflake allows you to create a quick, zero-copy duplicate of an existing database, schema, or table. This feature enables users to replicate data structures and content for testing or development without duplicating the underlying storage.
@@ -110,4 +108,4 @@ The expected output is:
 | 1  | test |
 +----+------+
 1 Row(s) produced. Time Elapsed: 0.012s
-```
+```
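
A zero-copy clone is a single statement. A sketch against the emulator; `products` and `products_dev` are hypothetical table names, and `products` is assumed to already exist:

```python
import snowflake.connector

conn = snowflake.connector.connect(
    user="test", password="test", account="test",
    host="snowflake.localhost.localstack.cloud", port=4566,
)
cur = conn.cursor()
# The clone shares storage with the source until either side diverges.
cur.execute("CREATE TABLE products_dev CLONE products")
cur.execute("SELECT COUNT(*) FROM products_dev")
print(cur.fetchone())
```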

src/content/docs/snowflake/features/cross-database-resource-sharing.md

Lines changed: 1 addition & 3 deletions

@@ -3,8 +3,6 @@ title: Cross-Database Resource Sharing
 description: Get started with cross-database resource sharing in the Snowflake emulator
 ---

-
-
 ## Introduction

 Snowflake data providers can easily share data from various databases using secure views. These views can include schemas, tables, and other views from one or more databases, as long as they're part of the same account.
@@ -97,4 +95,4 @@ The expected output is:

 ```plaintext
 (1, 2, 3)
-```
+```
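
To make the mechanics concrete, a sketch that creates a secure view in one database over a table in another; all object names are hypothetical, and both databases sit in the same account, which is what permits the reference:

```python
import snowflake.connector

conn = snowflake.connector.connect(
    user="test", password="test", account="test",
    host="snowflake.localhost.localstack.cloud", port=4566,
)
cur = conn.cursor()
cur.execute("CREATE DATABASE IF NOT EXISTS db1")
cur.execute("CREATE DATABASE IF NOT EXISTS db2")
cur.execute("CREATE TABLE db2.public.t1 (a INT, b INT, c INT)")
cur.execute("INSERT INTO db2.public.t1 VALUES (1, 2, 3)")
# The secure view in db1 reads from db2 across database boundaries.
cur.execute(
    "CREATE SECURE VIEW db1.public.shared_view AS "
    "SELECT * FROM db2.public.t1"
)
cur.execute("SELECT * FROM db1.public.shared_view")
print(cur.fetchone())  # expected: (1, 2, 3)
```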

src/content/docs/snowflake/features/dynamic-tables.md

Lines changed: 1 addition & 3 deletions

@@ -3,8 +3,6 @@ title: Dynamic Tables
 description: Get started with Dynamic Tables in LocalStack for Snowflake
 ---

-
-
 ## Introduction

 Snowflake Dynamic Tables enable a background process to continuously load new data from sources into the table, supporting both delta and full load operations. A dynamic table automatically updates to reflect query results, removing the need for a separate target table and custom code for data transformation. This table is kept current through regularly scheduled refreshes by an automated process.
@@ -91,4 +89,4 @@ The output should be:
 | 1  | foo  |
 | 2  | bar  |
 +----+------+
-```
+```
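
The refresh behavior hangs off the table definition itself. A sketch using Snowflake's documented `CREATE DYNAMIC TABLE` syntax; the source table, warehouse, and lag are hypothetical:

```python
import snowflake.connector

conn = snowflake.connector.connect(
    user="test", password="test", account="test",
    host="snowflake.localhost.localstack.cloud", port=4566,
)
cur = conn.cursor()
cur.execute("CREATE TABLE src (id INT, name TEXT)")
# TARGET_LAG bounds how stale the dynamic table may get before the
# background process re-runs the defining query.
cur.execute("""
    CREATE DYNAMIC TABLE dyn_copy
      TARGET_LAG = '1 minute'
      WAREHOUSE = test
      AS SELECT id, name FROM src
""")
```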

src/content/docs/snowflake/features/hybrid-tables.md

Lines changed: 1 addition & 3 deletions

@@ -3,8 +3,6 @@ title: Hybrid Tables
 description: Get started with Hybrid Tables in LocalStack for Snowflake
 ---

-
-
 ## Introduction

 Snowflake Hybrid tables, also known as Unistore hybrid tables, support fast, single-row operations by enforcing unique constraints for required primary keys and including indexes to speed up data retrieval. These tables are designed to optimize support for both analytical and transactional workloads simultaneously, underpinning Snowflake's Unistore architecture.
@@ -73,4 +71,4 @@ The output should be:
 | -----------------------------------------+
 | TEST-TABLE successfully dropped.         |
 +------------------------------------------+
-```
+```
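
The required primary key is what backs the fast single-row path. A sketch with hypothetical names:

```python
import snowflake.connector

conn = snowflake.connector.connect(
    user="test", password="test", account="test",
    host="snowflake.localhost.localstack.cloud", port=4566,
)
cur = conn.cursor()
# PRIMARY KEY is mandatory for hybrid tables; it enforces uniqueness
# and is backed by an index for keyed lookups.
cur.execute("""
    CREATE HYBRID TABLE orders (
        id INT PRIMARY KEY,
        status TEXT
    )
""")
cur.execute("INSERT INTO orders VALUES (1, 'open')")
cur.execute("SELECT status FROM orders WHERE id = 1")
print(cur.fetchone())  # single-row lookup: ('open',)
```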

src/content/docs/snowflake/features/iceberg-tables.md

Lines changed: 3 additions & 5 deletions

@@ -3,8 +3,6 @@ title: Iceberg Tables
 description: This is a dummy description.
 ---

-
-
 ## Introduction

 Iceberg tables uses [Apache Iceberg](https://iceberg.apache.org/) open table format specification to provide an abstraction layer on data files stored in open formats. Iceberg tables for Snowflake offer schema evolution, partitioning, and snapshot isolation to manage the table data efficiently.
@@ -81,6 +79,6 @@ The output should be:

 You can also list the content of the S3 bucket:

-{{< command >}}
-$ awslocal s3 ls --recursive s3://test-bucket/
-{{< / command >}}
+```bash
+awslocal s3 ls --recursive s3://test-bucket/
+```
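
`awslocal` is the AWS CLI pointed at LocalStack; the same listing works from `boto3` against the default edge port. A sketch, assuming the conventional dummy credentials:

```python
import boto3

# LocalStack accepts any credentials; "test"/"test" is the convention.
s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:4566",
    aws_access_key_id="test",
    aws_secret_access_key="test",
    region_name="us-east-1",
)
# Walk every key in the bucket, like `s3 ls --recursive`.
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="test-bucket"):
    for obj in page.get("Contents", []):
        print(obj["Key"], obj["Size"])
```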

src/content/docs/snowflake/features/native-apps.md

Lines changed: 2 additions & 2 deletions

@@ -66,14 +66,14 @@ You can access the Native App by visiting your preferred browser and navigating
 https://snowflake.localhost.localstack.cloud:4566/apps/test/test/TASKS_STREAMS_APP_username/
 ```

-{{< alert title="Note" >}}
+:::note
 The URL above is an example. Change the outputted URL by:

 1. Replacing `https://app.snowflake.com` with `https://snowflake.localhost.localstack.cloud:4566`.
 2. Changing the path structure from `/#/apps/application/` to `/apps/test/test/`.

 You can make additional changes depending on your local setup.
-{{< /alert >}}
+:::

 The following app should be displayed:
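
The two rewrite steps in that note are mechanical enough to script. A hypothetical helper, not part of the docs or the emulator:

```python
def localstack_app_url(snowflake_url: str) -> str:
    """Rewrite a Snowflake Native App URL for the local emulator,
    following the two steps from the note above."""
    url = snowflake_url.replace(
        "https://app.snowflake.com",
        "https://snowflake.localhost.localstack.cloud:4566",
    )
    return url.replace("/#/apps/application/", "/apps/test/test/")

print(localstack_app_url(
    "https://app.snowflake.com/#/apps/application/TASKS_STREAMS_APP_username/"
))
# https://snowflake.localhost.localstack.cloud:4566/apps/test/test/TASKS_STREAMS_APP_username/
```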

src/content/docs/snowflake/features/snowpipe.md

Lines changed: 14 additions & 14 deletions

@@ -9,10 +9,10 @@ Snowpipe allows you to load data into Snowflake tables from files stored in an e

 The Snowflake emulator supports Snowpipe, allowing you to create and manage Snowpipe objects in the emulator. You can use Snowpipe to load data into Snowflake tables from files stored in a local directory or a local/remote S3 bucket. The following operations are supported:

-* `CREATE PIPE`
-* `DESCRIBE PIPE`
-* `DROP PIPE`
-* `SHOW PIPES`
+* [`CREATE PIPE`](https://docs.snowflake.com/en/sql-reference/sql/create-pipe)
+* [`DESCRIBE PIPE`](https://docs.snowflake.com/en/sql-reference/sql/describe-pipe)
+* [`DROP PIPE`](https://docs.snowflake.com/en/sql-reference/sql/drop-pipe)
+* [`SHOW PIPES`](https://docs.snowflake.com/en/sql-reference/sql/show-pipes)

 ## Getting started

@@ -24,9 +24,9 @@ In this guide, you will create a stage, and a pipe to load data from a local S3

 You can create a local S3 bucket using the `mb` command with the `awslocal` CLI.

-{{< command >}}
-$ awslocal s3 mb s3://test-bucket
-{{< / command >}}
+```bash
+awslocal s3 mb s3://test-bucket
+```

 ### Create a stage

@@ -73,11 +73,11 @@ Retrieve the `notification_channel` value from the output of the `DESC PIPE` que

 You can use the [`PutBucketNotificationConfiguration`](https://docs.aws.amazon.com/AmazonS3/latest/API/API_PutBucketNotificationConfiguration.html) API to create a bucket notification configuration that sends notifications to Snowflake when new files are uploaded to the S3 bucket.

-{{< command >}}
-$ awslocal s3api put-bucket-notification-configuration \
+```bash
+awslocal s3api put-bucket-notification-configuration \
   --bucket test-bucket \
   --notification-configuration file://notification.json
-{{< / command >}}
+```

 The `notification.json` file should contain the following configuration:

@@ -107,14 +107,14 @@ Copy a JSON file to the S3 bucket to trigger the pipe to load the data into the

 Upload the file to the S3 bucket:

-{{< command >}}
-$ awslocal s3 cp test.json s3://test-bucket/
-{{< / command >}}
+```bash
+awslocal s3 cp test.json s3://test-bucket/
+```

 ### Check the data

 After uploading the file to the S3 bucket in the previous step, the contents of the JSON file should get inserted into the table automatically by the `test_pipe` pipe. You can check the data in the table using the following query:

 ```sql
 SELECT * FROM my_test_table
-```
+```
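
The same notification wiring can be done programmatically. A `boto3` sketch of the `PutBucketNotificationConfiguration` call; the queue ARN below is a placeholder for the pipe's actual `notification_channel` value:

```python
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:4566",
    aws_access_key_id="test",
    aws_secret_access_key="test",
    region_name="us-east-1",
)
# QueueArn must be the `notification_channel` returned by
# `DESC PIPE test_pipe`; the ARN below is a placeholder.
s3.put_bucket_notification_configuration(
    Bucket="test-bucket",
    NotificationConfiguration={
        "QueueConfigurations": [
            {
                "QueueArn": "arn:aws:sqs:us-east-1:000000000000:sf-snowpipe-queue",
                "Events": ["s3:ObjectCreated:*"],
            }
        ]
    },
)
```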

src/content/docs/snowflake/features/stages.md renamed to src/content/docs/snowflake/features/stages.mdx

Lines changed: 18 additions & 11 deletions

@@ -2,8 +2,7 @@
 title: Stages
 description: Get started with Stages in LocalStack for Snowflake
 ---
-
-
+import { Tabs, TabItem } from '@astrojs/starlight/components';

 ## Introduction

@@ -60,14 +59,22 @@ FILE_FORMAT = csv;

 In this example, you can upload the CSV files to the table stage provided for `employees` table.

-{{< tabpane >}}
-{{< tab header="Linux/macOS" lang="sql" >}}
+
+<Tabs>
+<TabItem label="Linux/macOS">
+```sql
 PUT file://./employees0*.csv @@employees_stage AUTO_COMPRESS=TRUE;
-{{< /tab >}}
-{{< tab header="Windows" lang="sql" >}}
+```
+</TabItem>
+<TabItem label="Windows">
+```sql
 PUT file://C:\temp\employees0*.csv @@employees_stage AUTO_COMPRESS=TRUE;
-{{< /tab >}}
-{{< /tabpane >}}
+```
+</TabItem>
+</Tabs>
+
+
+

 The expected output is:

@@ -87,10 +94,10 @@ The expected output is:

 You can also load data from an S3 bucket using the `CREATE STAGE` command. Create a new S3 bucket named `testbucket` and upload the [employees CSV files](./getting-started.zip) to the bucket. You can use LocalStack's `awslocal` CLI to create the S3 bucket and upload the files.

-{{< command >}}
+```bash
 awslocal s3 mb s3://testbucket
 awslocal s3 cp employees0*.csv s3://testbucket
-{{< /command >}}
+```

 In this example, you can create a stage called `my_s3_stage` to load data from an S3 bucket:

@@ -107,4 +114,4 @@ You can further copy data from the S3 stage to the table using the `COPY INTO` c
 COPY INTO mytable
 FROM @my_s3_stage
 PATTERN='.*employees.*.csv';
-```
+```
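
The S3-stage flow can likewise be driven end to end from Python. A sketch reusing the `testbucket`, `my_s3_stage`, and `mytable` names from the docs; the exact stage options the emulator expects may differ:

```python
import snowflake.connector

conn = snowflake.connector.connect(
    user="test", password="test", account="test",
    host="snowflake.localhost.localstack.cloud", port=4566,
)
cur = conn.cursor()
# Point the stage at the local bucket created with awslocal above.
cur.execute("CREATE STAGE my_s3_stage URL='s3://testbucket'")
# Copy any employees CSVs from the stage into the target table.
cur.execute(
    "COPY INTO mytable FROM @my_s3_stage PATTERN='.*employees.*.csv'"
)
```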
