@@ -133,15 +133,13 @@ Storage][google-cloud-storage] for export bucket functionality.

<InfoBox>

-Ensure the AWS credentials are correctly configured in IAM to allow reads and
-writes to the export bucket in S3 if you are not using storage integration.
-If you are using storage integration then you still need to configure access keys
-for Cube Store to be able to read from the export bucket.
-It's possible to authenticate with IAM roles instead of access keys for Cube Store.
+Ensure proper IAM privileges are configured for reads and writes to the S3 export
+bucket. Snowflake can authenticate with either a storage integration or user
+credentials, and Cube Store with either IAM roles/IRSA or user credentials.
+Mixed configurations are supported.

</InfoBox>

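For the storage-integration options below, the Snowflake side can be sketched roughly as follows. This is a minimal example only; the integration name, role ARN, and bucket path here are placeholders, and the IAM trust policy for the role must still be set up in AWS:

```sql
-- Hypothetical names: substitute your own account ID, role, and bucket.
CREATE STORAGE INTEGRATION cube_export_integration
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::<AWS_ACCOUNT_ID>:role/<SNOWFLAKE_S3_ROLE>'
  STORAGE_ALLOWED_LOCATIONS = ('s3://<EXPORT_BUCKET>/');
```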
-Using IAM user credentials:
+Using IAM user credentials for both:

```dotenv
CUBEJS_DB_EXPORT_BUCKET_TYPE=s3
@@ -151,8 +149,8 @@ CUBEJS_DB_EXPORT_BUCKET_AWS_SECRET=<AWS_SECRET>
CUBEJS_DB_EXPORT_BUCKET_AWS_REGION=<AWS_REGION>
```

-[Using Storage Integration][snowflake-docs-aws-integration] to write to the export bucket and
-then Access Keys to read from Cube Store:
+Using a [Storage Integration][snowflake-docs-aws-integration] to write to the export bucket and
+user credentials to read from Cube Store:

```dotenv
CUBEJS_DB_EXPORT_BUCKET_TYPE=s3
@@ -163,7 +161,8 @@ CUBEJS_DB_EXPORT_BUCKET_AWS_SECRET=<AWS_SECRET>
CUBEJS_DB_EXPORT_BUCKET_AWS_REGION=<AWS_REGION>
```

-Using Storage Integration to write to export bucket and IAM role to read from Cube Store:
+Using a Storage Integration to write to the export bucket and an IAM role/IRSA to read from Cube Store:
+
```dotenv
CUBEJS_DB_EXPORT_BUCKET_TYPE=s3
CUBEJS_DB_EXPORT_BUCKET=my.bucket.on.s3