
Commit f7340ea: finish integrations

1 parent 9510e39

12 files changed (+127 lines, -137 lines)

(Binary file changed, 696 KB; diff not rendered.)

src/content/docs/snowflake/integrations/airflow.md

Lines changed: 14 additions & 12 deletions

@@ -3,8 +3,6 @@ title: Airflow
 description: Use Airflow to run local ETL jobs against the Snowflake emulator
 ---
 
-
-
 ## Introduction
 
 Apache [Airflow](https://airflow.apache.org) is a platform for running data-centric workflows and scheduled compute jobs.
@@ -17,19 +15,20 @@ On this page we outline how to set up the connection between local Airflow and t
 
 In order to create an Airflow environment in local MWAA, we can use the [`awslocal`](https://github.com/localstack/awscli-local) command:
 
-{{< command>}}
-$ awslocal s3 mb s3://my-mwaa-bucket
-$ awslocal mwaa create-environment --dag-s3-path /dags \
+```bash
+awslocal s3 mb s3://my-mwaa-bucket
+awslocal mwaa create-environment --dag-s3-path /dags \
     --execution-role-arn arn:aws:iam::000000000000:role/airflow-role \
     --network-configuration {} \
     --source-bucket-arn arn:aws:s3:::my-mwaa-bucket \
     --airflow-version 2.6.3 \
     --name my-mwaa-env
-{{< /command >}}
+```
 
 ## Create an Airflow DAG script that connects to LocalStack Snowflake
 
 We can then create a local file `my_dag.py` with the Airflow DAG definition, for example:
+
 ```python
 import datetime
 import json
@@ -85,6 +84,7 @@ In order to use the `SnowflakeOperator` in your Airflow DAG, a small patch is re
 The code listings below contain the patch for different Airflow versions - simply copy the relevant snippet and paste it into the top of your DAG script (e.g., `my_dag.py`).
 
 **Airflow version 2.6.3 and above**:
+
 ```python
 # ---
 # patch for local Snowflake connection, for Airflow 2.6.3 and above
@@ -108,6 +108,7 @@ SnowflakeHook._get_conn_params = _get_conn_params
 ```
 
 **Airflow version 2.9.2 and above**:
+
 ```python
 # ---
 # patch for local Snowflake connection, for Airflow 2.9.2 / 2.10.1
@@ -131,15 +132,16 @@ SnowflakeHook._get_conn_params = _get_conn_params
 # ... rest of your DAG script below ...
 ```
 
-{{< alert type="info" title="Note" >}}
+:::note
 In a future release, we're looking to integrate these patches directly into the LocalStack environment, such that users do not need to apply these patches in DAG scripts manually.
-{{< /alert >}}
+:::
 
 ## Deploying the DAG to Airflow
 
 Next, we copy the `my_dag.py` file to the `/dags` folder within the `my-mwaa-bucket` S3 bucket, to trigger the deployment of the DAG in Airflow:
-{{< command>}}
-$ awslocal s3 cp my_dag.py s3://my-mwaa-bucket/dags/
-{{< /command >}}
 
-You should then be able to open the Airflow UI (e.g., http://localhost.localstack.cloud:4510/dags) to view the status of the DAG and trigger a DAG run.
+```bash
+awslocal s3 cp my_dag.py s3://my-mwaa-bucket/dags/
+```
+
+You should then be able to open the Airflow UI (e.g., http://localhost.localstack.cloud:4510/dags) to view the status of the DAG and trigger a DAG run.
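
The DAG definition itself is only partially visible in this diff. For reference, the snippet below is a minimal sketch of a `my_dag.py` that talks to the emulator; the connection id `snowflake_default`, the table name, and the schedule are illustrative assumptions, and the `SnowflakeHook` patch described on the page is assumed to be pasted above this code.

```python
import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

# Assumptions: an Airflow connection "snowflake_default" points at the emulator
# (host snowflake.localhost.localstack.cloud, user/password "test"), and the
# SnowflakeHook patch from the docs page is applied above this block.
with DAG(
    dag_id="snowflake_local_example",
    start_date=datetime.datetime(2024, 1, 1),
    schedule=None,  # trigger manually from the Airflow UI
    catchup=False,
) as dag:
    create_table = SnowflakeOperator(
        task_id="create_table",
        snowflake_conn_id="snowflake_default",
        sql="CREATE TABLE IF NOT EXISTS ab_table (a INT, b TEXT)",
    )
    insert_rows = SnowflakeOperator(
        task_id="insert_rows",
        snowflake_conn_id="snowflake_default",
        sql="INSERT INTO ab_table VALUES (1, 'one'), (2, 'two')",
    )
    create_table >> insert_rows
```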

src/content/docs/snowflake/integrations/continuous-integration.md renamed to src/content/docs/snowflake/integrations/continuous-integration.mdx

Lines changed: 15 additions & 9 deletions

@@ -3,7 +3,7 @@ title: Continuous Integration
 description: Get started with Snowflake emulator in continuous integration (CI) environments.
 ---
 
-
+import { Tabs, TabItem } from '@astrojs/starlight/components';
 
 ## Introduction
 
@@ -26,8 +26,9 @@ To create a CI key, follow these steps:
 
 The following examples demonstrate how to set up the emulator in GitHub Actions, CircleCI, and GitLab CI.
 
-{{< tabpane >}}
-{{< tab header="GitHub Actions" lang="yaml" >}}
+<Tabs>
+<TabItem label="GitHub Actions">
+```yaml
 name: LocalStack Test
 on: [ push, pull_request ]
 
@@ -48,8 +49,10 @@ jobs:
           echo "Startup complete"
         env:
           LOCALSTACK_API_KEY: ${{ secrets.LOCALSTACK_API_KEY }}
-{{< /tab >}}
-{{< tab header="CircleCI" lang="yaml" >}}
+```
+</TabItem>
+<TabItem label="CircleCI">
+```yaml
 version: 2.1
 
 orbs:
@@ -79,8 +82,10 @@ workflows:
   build:
     jobs:
      - example-job
-{{< /tab >}}
-{{< tab header="GitLab CI" lang="yaml" >}}
+```
+</TabItem>
+<TabItem label="GitLab CI">
+```yaml
 image: docker:20.10.16
 
 stages:
@@ -108,5 +113,6 @@ test:
    - echo "${dind_ip} localhost.localstack.cloud " >> /etc/hosts
    - DOCKER_HOST="tcp://${dind_ip}:2375" IMAGE_NAME=localstack/snowflake localstack start -d
    - localstack wait -t 15
-{{< /tab >}}
-{{< /tabpane >}}
+```
+</TabItem>
+</Tabs>
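
The three pipelines above only start the emulator; what runs against it afterwards is project-specific. One possibility is a small smoke test executed with the Snowflake Python connector once `localstack wait` succeeds. This is a sketch under the assumption that the emulator uses the default `test`/`test` credentials and the `snowflake.localhost.localstack.cloud` hostname shown elsewhere in these docs; the file name is hypothetical.

```python
# smoke_test.py (hypothetical): run after the emulator is up in CI.
import snowflake.connector

# Assumption: default emulator credentials and hostname.
conn = snowflake.connector.connect(
    user="test",
    password="test",
    account="test",
    database="test",
    host="snowflake.localhost.localstack.cloud",
)
try:
    with conn.cursor() as cur:
        cur.execute("SELECT CURRENT_VERSION()")
        print("Emulator reachable, version:", cur.fetchone()[0])
finally:
    conn.close()
```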

src/content/docs/snowflake/integrations/dbeaver.md

Lines changed: 2 additions & 6 deletions

@@ -3,8 +3,6 @@ title: DBeaver
 description: Use DBeaver to interact with the Snowflake emulator
 ---
 
-
-
 ## Introduction
 
 [DBeaver](https://dbeaver.io/) is a free and open-source universal database tool for developers, database administrators, and analysts. DBeaver provides a wide range of features, such as executing SQL statements, viewing and editing data, managing database objects, and more.
@@ -28,10 +26,8 @@ To create a new connection in DBeaver, follow these steps:
 - **Host**: `snowflake.localhost.localstack.cloud`
 - **User**: `test`
 - **Password**: `test`
-<img src="dbeaver-new-connection.png" alt="New connection in DBeaver" width="700"/>
-<br><br>
-
+![New connection in DBeaver](/images/snowflake/dbeaver-new-connection.png)
 - Click **Test Connection**.
 - If the connection test succeeds, click **Finish**. The Snowflake database will appear in DBeaver's Database Navigator.
 
-You can verify the connection by running a query to check the Snowflake version: `SELECT CURRENT_VERSION();`
+You can verify the connection by running a query to check the Snowflake version: `SELECT CURRENT_VERSION();`

src/content/docs/snowflake/integrations/dbt.md

Lines changed: 14 additions & 15 deletions

@@ -3,8 +3,6 @@ title: dbt
 description: Use dbt to interact with the Snowflake emulator
 ---
 
-
-
 ## Introduction
 
 [dbt (data build tool)](https://www.getdbt.com/) is a transformation workflow tool that enables data analysts and engineers to transform data in their warehouses by writing modular SQL. dbt handles version control, documentation, and modularity for data transformations.
@@ -19,9 +17,9 @@ In this guide, you will learn how to configure dbt to interact with the Snowflak
 
 First, install dbt with the Snowflake adapter:
 
-{{< command >}}
-$ pip install dbt-snowflake
-{{< /command >}}
+```bash
+pip install dbt-snowflake
+```
 
 ### Configure dbt Profile
 
@@ -49,9 +47,9 @@ localstack_snowflake:
 
 To verify your dbt configuration is working correctly with the Snowflake emulator, run:
 
-{{< command >}}
-$ dbt debug --profile localstack_snowflake
-{{< /command >}}
+```bash
+dbt debug --profile localstack_snowflake
+```
 
 You should see output indicating a successful connection to the Snowflake emulator.
 
@@ -96,14 +94,15 @@ models:
          - not_null
 ```
 
-{{< command >}}
+You can run all models and tests with the following commands:
+
+```bash
 # Run all models
-$ dbt run --profile localstack_snowflake
+dbt run --profile localstack_snowflake
 
 # Run tests
-$ dbt test --profile localstack_snowflake
-
-{{< /command >}}
+dbt test --profile localstack_snowflake
+```
 
 ### Project Structure
 
@@ -149,6 +148,6 @@ models:
 3. **Documentation**: Document your models using dbt's built-in documentation features
 4. **Modularity**: Break down complex transformations into smaller, reusable models
 
-{{< alert type="info" >}}
+:::note
 It's a good practice to always test your dbt models locally with the Snowflake emulator before deploying to production, to save time and resources.
-{{< /alert >}}
+:::
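
The page drives dbt through the CLI. If you prefer to orchestrate the same runs from Python, for example inside a test harness, dbt Core 1.5+ exposes a programmatic entry point. The sketch below assumes the `localstack_snowflake` profile from this page is already present in your `profiles.yml`.

```python
# Sketch: programmatic equivalent of "dbt run" and "dbt test" (requires dbt-core >= 1.5).
from dbt.cli.main import dbtRunner

runner = dbtRunner()
for args in (["run"], ["test"]):
    result = runner.invoke(args + ["--profile", "localstack_snowflake"])
    if not result.success:
        raise SystemExit(f"dbt {args[0]} failed: {result.exception}")
```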

src/content/docs/snowflake/integrations/flyway.md

Lines changed: 1 addition & 3 deletions

@@ -3,8 +3,6 @@ title: Flyway
 description: Use Flyway to interact with the Snowflake emulator
 ---
 
-
-
 ## Introduction
 
 [Flyway](https://flywaydb.org/) is an open-source database migration tool that simplifies the process of managing and applying database migrations. Flyway supports various databases, including Snowflake, allowing you to manage database schema changes, version control, and data migration in a structured and automated way.
@@ -37,4 +35,4 @@ To connect Flyway to the Snowflake emulator, follow these steps:
 * Enter JDBC URL as `jdbc:snowflake://http://snowflake.localhost.localstack.cloud:4566/?db=test&schema=PUBLIC&JDBC_QUERY_RESULT_FORMAT=JSON`.
 * Click on **Test connection**.
 
-If the connection test succeeds, you can start applying database migrations using Flyway.
+If the connection test succeeds, you can start applying database migrations using Flyway.

src/content/docs/snowflake/integrations/pulumi.md

Lines changed: 14 additions & 16 deletions

@@ -3,8 +3,6 @@ title: Pulumi
 description: Use Pulumi to interact with the Snowflake emulator
 ---
 
-
-
 ## Introduction
 
 [Pulumi](https://pulumi.com/) is an Infrastructure-as-Code (IaC) framework that allows you to define and provision infrastructure using familiar programming languages. Pulumi supports a wide range of cloud providers and services, including AWS, Azure, Google Cloud, and more.
@@ -19,19 +17,19 @@ In this guide, you will learn how to configure Pulumi to interact with the Snowf
 
 To use Pulumi with the Snowflake emulator, you need to configure the Snowflake provider in your Pulumi configuration file. Create a blank Pulumi project, and add the following environment variables to your Pulumi stack:
 
-{{< command>}}
-$ pulumi config set snowflake:account test
-$ pulumi config set snowflake:region test
-$ pulumi config set snowflake:username test
-$ pulumi config set snowflake:password test
-$ pulumi config set snowflake:host snowflake.localhost.localstack.cloud
-{{< /command >}}
+```bash
+pulumi config set snowflake:account test
+pulumi config set snowflake:region test
+pulumi config set snowflake:username test
+pulumi config set snowflake:password test
+pulumi config set snowflake:host snowflake.localhost.localstack.cloud
+```
 
 You can install the Snowflake provider in any of the programming languages supported by Pulumi, such as Python, JavaScript, TypeScript, and Go. The following example shows how to install the Snowflake provider for your TypeScript project:
 
-{{< command >}}
-$ npm install @pulumi/snowflake
-{{< /command >}}
+```bash
+npm install @pulumi/snowflake
+```
 
 ### Create Snowflake resources
 
@@ -50,9 +48,9 @@ const simple = new snowflake.Database("simple", {
 
 You can now deploy the Pulumi configuration to create the Snowflake resources locally. Run the following command to deploy the Pulumi configuration:
 
-{{< command >}}
-$ pulumi up
-{{< /command >}}
+```bash
+pulumi up
+```
 
 The expected output should show the resources being created in the Snowflake emulator:
 
@@ -75,4 +73,4 @@ Resources:
     + 2 created
 
 Duration: 5s
-```
+```
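
The diff keeps the TypeScript example. Since the page notes that the provider is also available for Python, here is a rough Python counterpart of the same `Database` resource; the `pulumi-snowflake` package name and the `name`/`comment` arguments follow the upstream provider, but treat the exact fields as assumptions to check against the provider version you install.

```python
# Rough Python counterpart of the TypeScript example
# (pip install pulumi pulumi-snowflake; stack config as set via "pulumi config set snowflake:*").
import pulumi
import pulumi_snowflake as snowflake

simple = snowflake.Database(
    "simple",
    name="SIMPLE_DB",
    comment="A simple database created in the Snowflake emulator",
)

pulumi.export("database_name", simple.name)
```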
