@@ -134,10 +134,11 @@ Storage][google-cloud-storage] for export bucket functionality.
<InfoBox>

Ensure the AWS credentials are correctly configured in IAM to allow reads and
- writes to the export bucket in S3 if you are not using storage integration.
- If you are using storage integration then you still need to configure access keys
- for Cube Store to be able to read from the export bucket.
- It's possible to authenticate with IAM roles instead of access keys for Cube Store.
+ writes to the export bucket in S3. You can authenticate using:
+
+ - **IAM user credentials** - explicit AWS access keys and secrets
+ - **Storage integration** - a Snowflake-managed AWS integration (Cube Store may still need access keys to read from the bucket)
+ - **IAM roles** - execution environment roles (IRSA, instance profiles) for both Snowflake and Cube Store

</InfoBox>

@@ -163,14 +164,35 @@ CUBEJS_DB_EXPORT_BUCKET_AWS_SECRET=<AWS_SECRET>
CUBEJS_DB_EXPORT_BUCKET_AWS_REGION=<AWS_REGION>
```

- Using Storage Integration to write to export bocket and IAM role to read from Cube Store:
+ Using a storage integration to write to the export bucket and an IAM role for Cube Store to read from it:
```dotenv
CUBEJS_DB_EXPORT_BUCKET_TYPE=s3
CUBEJS_DB_EXPORT_BUCKET=my.bucket.on.s3
CUBEJS_DB_EXPORT_INTEGRATION=aws_int
CUBEJS_DB_EXPORT_BUCKET_AWS_REGION=<AWS_REGION>
```

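+ If Cube Store cannot assume an IAM role, it still needs access keys to read
+ from the export bucket even when Snowflake writes through the storage
+ integration. A minimal sketch of that combination, pairing the integration
+ with the same access key variables used above:
+
+ ```dotenv
+ CUBEJS_DB_EXPORT_BUCKET_TYPE=s3
+ CUBEJS_DB_EXPORT_BUCKET=my.bucket.on.s3
+ CUBEJS_DB_EXPORT_INTEGRATION=aws_int
+ CUBEJS_DB_EXPORT_BUCKET_AWS_KEY=<AWS_KEY>
+ CUBEJS_DB_EXPORT_BUCKET_AWS_SECRET=<AWS_SECRET>
+ CUBEJS_DB_EXPORT_BUCKET_AWS_REGION=<AWS_REGION>
+ ```
+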
+ Using IAM roles for both Snowflake and Cube Store (no explicit access keys required):
+ ```dotenv
+ CUBEJS_DB_EXPORT_BUCKET_TYPE=s3
+ CUBEJS_DB_EXPORT_BUCKET=my.bucket.on.s3
+ CUBEJS_DB_EXPORT_BUCKET_AWS_REGION=<AWS_REGION>
+ ```
+

#### Google Cloud Storage
