Commit f5367b5

revert docs changes
1 parent fad29b1 commit f5367b5

File tree

1 file changed: +4 -12 lines changed

docs/pages/product/configuration/data-sources/snowflake.mdx

Lines changed: 4 additions & 12 deletions

@@ -134,11 +134,10 @@ Storage][google-cloud-storage] for export bucket functionality.
 <InfoBox>
 
 Ensure the AWS credentials are correctly configured in IAM to allow reads and
-writes to the export bucket in S3. You can authenticate using:
-
-- **IAM user credentials** - Explicit AWS access keys and secrets
-- **Storage integration** - Snowflake-managed AWS integration (may still require credentials for exported file processing)
-- **IAM roles** - Use execution environment roles (IRSA, instance profiles) for both Snowflake and Cube Store
+writes to the export bucket in S3 if you are not using storage integration.
+If you are using storage integration then you still need to configure access keys
+for Cube Store to be able to read from the export bucket.
+It's possible to authenticate with IAM roles instead of access keys for Cube Store.
 
 </InfoBox>
 
@@ -172,13 +171,6 @@ CUBEJS_DB_EXPORT_INTEGRATION=aws_int
 CUBEJS_DB_EXPORT_BUCKET_AWS_REGION=<AWS_REGION>
 ```
 
-Using IAM roles for both Snowflake and Cube Store (no credentials required):
-```dotenv
-CUBEJS_DB_EXPORT_BUCKET_TYPE=s3
-CUBEJS_DB_EXPORT_BUCKET=my.bucket.on.s3
-CUBEJS_DB_EXPORT_BUCKET_AWS_REGION=<AWS_REGION>
-```
-
 
 #### Google Cloud Storage
 
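For reference, the wording kept by this revert describes pairing a Snowflake storage integration with access keys that Cube Store uses to read the exported files. A minimal sketch of such a configuration, assuming Cube's `CUBEJS_DB_EXPORT_BUCKET_AWS_KEY` and `CUBEJS_DB_EXPORT_BUCKET_AWS_SECRET` settings (not shown in this diff), might look like:

```dotenv
# Sketch only: Snowflake exports through the storage integration, while
# Cube Store reads the exported files using explicit access keys.
CUBEJS_DB_EXPORT_BUCKET_TYPE=s3
CUBEJS_DB_EXPORT_BUCKET=my.bucket.on.s3
CUBEJS_DB_EXPORT_BUCKET_AWS_REGION=<AWS_REGION>
CUBEJS_DB_EXPORT_INTEGRATION=aws_int

# Assumed variable names for the Cube Store access keys; verify against
# the Snowflake data source documentation.
CUBEJS_DB_EXPORT_BUCKET_AWS_KEY=<AWS_KEY>
CUBEJS_DB_EXPORT_BUCKET_AWS_SECRET=<AWS_SECRET>
```

As the retained text notes, Cube Store can also authenticate with IAM roles instead of access keys where the execution environment provides them.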