Commit 341df13

S3UTILS-222 Tool to update CRR policies
1 parent b576902 commit 341df13

File tree: 6 files changed (+1232 -36 lines)

package.json

Lines changed: 1 addition & 1 deletion

```diff
@@ -26,6 +26,7 @@
   },
   "homepage": "https://github.com/scality/s3utils#readme",
   "dependencies": {
+    "@aws-sdk/client-iam": "^3.962.0",
     "@aws-sdk/client-s3": "^3.873.0",
     "@aws-sdk/node-http-handler": "^3.374.0",
     "@scality/cloudserverclient": "^1.0.4",
@@ -54,7 +55,6 @@
     "string-width": "4.2.3"
   },
   "devDependencies": {
-    "@aws-sdk/client-iam": "^3.962.0",
     "@scality/eslint-config-scality": "scality/Guidelines#8.3.0",
     "@sinonjs/fake-timers": "^14.0.0",
     "eslint": "^9.14.0",
```

replicationAudit/README.md

Lines changed: 179 additions & 1 deletion
````diff
@@ -50,12 +50,19 @@ ansible -i env/$ENV_DIR/inventory runners_s3[0] -m shell \
 -a 'cat {{ env_host_logs}}/scality-vault{{ container_name_suffix | default("")}}/logs/missing.json' \
 | grep -v CHANGED | tee /root/replicationAudit_missing.json
 
-# Step 6: Clean up
+# Step 6: Clean up remote files
 ansible -i env/$ENV_DIR/inventory runners_s3[0] -m shell \
 -a 'rm -f {{ env_host_logs}}/scality-vault{{ container_name_suffix | default("")}}/logs/missing.json \
 {{ env_host_logs}}/scality-vault{{ container_name_suffix | default("")}}/logs/check-replication-permissions.js \
 {{ env_host_logs}}/scality-vault{{ container_name_suffix | default("")}}/logs/buckets-with-replication.json \
 /root/list-buckets-with-replication.sh'
+
+# Step 7 (optional): Fix missing permissions
+# Run from your local machine (requires vaultclient and @aws-sdk/client-iam)
+node replicationAudit/fix-missing-replication-permissions.js \
+    /root/replicationAudit_missing.json <supervisor-ip> admin1.json
+
+# Step 8: Re-run check to verify fixes (repeat steps 3-5)
 ```
 
 # Scripts Documentation
````
````diff
@@ -285,6 +292,13 @@ node check-replication-permissions.js [input-file] [leader-ip] [output-file] [--
 
 ### Output Format
 
+> **Breaking change (since 1.17.5):** The output now includes `ownerDisplayName`
+> in each result entry. This field is required by
+> `fix-missing-replication-permissions.js` to identify accounts without an
+> extra API call. If you ran `check-replication-permissions.js` on version
+> 1.17.4 or earlier, **re-run it** to produce an output that
+> `fix-missing-replication-permissions.js` can consume.
+
 The script produces a JSON file with metadata and results. The `results` array
 contains **only buckets missing the `s3:ReplicateObject` permission**.
 
````
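The version check described in the note above can be automated before attempting a fix. A minimal sketch, assuming the documented output shape (a `results` array with a per-entry `ownerDisplayName` string); the helper name is illustrative:

```javascript
// Pre-flight check: every result entry must carry ownerDisplayName before
// fix-missing-replication-permissions.js can use the file without extra
// API calls. The sample data below is illustrative.
function hasOwnerDisplayNames(report) {
    return (report.results || []).every(
        r => typeof r.ownerDisplayName === 'string' && r.ownerDisplayName.length > 0,
    );
}

// An entry produced by 1.17.4 or earlier lacks the field:
const oldEntry = {
    bucket: 'bucket-old-1',
    sourceRole: 'arn:aws:iam::267390090509:role/crr-role-outdated',
};
const newEntry = { ...oldEntry, ownerDisplayName: 'testaccount' };

console.log(hasOwnerDisplayNames({ results: [oldEntry] })); // prints: false
console.log(hasOwnerDisplayNames({ results: [newEntry] })); // prints: true
```

A `false` result means the input predates 1.17.5 and the check script should be re-run.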
```diff
@@ -310,6 +324,7 @@ contains **only buckets missing the `s3:ReplicateObject` permission**.
   "results": [
     {
       "bucket": "bucket-old-1",
+      "ownerDisplayName": "testaccount",
       "sourceRole": "arn:aws:iam::267390090509:role/crr-role-outdated",
       "policies": [
         {
```
````diff
@@ -461,3 +476,166 @@ Output saved to: /tmp/missing.json
 **Script timeout**
 
 - For many buckets, run directly on the S3 connector node via interactive SSH
+
+---
+
+## fix-missing-replication-permissions.js
+
+Reads the output of `check-replication-permissions.js` and creates IAM policies
+with `s3:ReplicateObject` for roles that are missing it.
+
+The script applies **minimal changes**: one policy per role (covering all affected
+buckets), with an explicit Statement ID (`AllowReplicateObjectAuditFix`) so the
+policies are easily identifiable later.
+
+### Prerequisites
+
+- Output from `check-replication-permissions.js` (missing.json)
+- Vault admin credentials (`admin1.json` with `accessKey` and `secretKeyValue`).
+  Found on the supervisor at:
+  ```
+  /srv/scality/s3/s3-offline/federation/env/<ENV_DIR>/vault/admin-clientprofile/admin1.json
+  ```
+- Network access to Vault admin/IAM API (port 8600) from the machine running the script
+- `vaultclient` and `@aws-sdk/client-iam` installed (both in s3utils dependencies)
````
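Going by the field names listed in the prerequisites, the admin credentials file has at least the following shape (placeholder values; the real `admin1.json` on the supervisor may contain additional fields):

```json
{
    "accessKey": "...",
    "secretKeyValue": "..."
}
```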
````diff
+### Usage
+
+```bash
+node fix-missing-replication-permissions.js <input-file> <vault-host> <admin-config> [output-file] [options]
+```
+
+| Argument | Default | Description |
+|----------|---------|-------------|
+| `input-file` | (required) | Path to missing.json from check script |
+| `vault-host` | (required) | Vault admin host (e.g., 13.50.166.21) |
+| `admin-config` | (required) | Path to admin credentials JSON |
+| `output-file` | replication-fix-results.json | Output file path |
+| `--iam-port <port>` | 8600 | Vault admin and IAM API port |
+| `--https` | (not set) | Use HTTPS to connect to Vault |
+| `--dry-run` | (not set) | Show what would be done without making changes |
````
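The defaults in the table above can be illustrated with a small argument-parsing sketch. This mirrors only the documented CLI surface; the function and its internals are hypothetical, not the script's actual code:

```javascript
// Hypothetical parser mirroring the documented interface: three required
// positionals, an optional output file, and --iam-port/--https/--dry-run.
function parseArgs(argv) {
    const opts = { iamPort: 8600, https: false, dryRun: false }; // documented defaults
    const positional = [];
    for (let i = 0; i < argv.length; i++) {
        const a = argv[i];
        if (a === '--iam-port') opts.iamPort = Number(argv[++i]);
        else if (a === '--https') opts.https = true;
        else if (a === '--dry-run') opts.dryRun = true;
        else positional.push(a);
    }
    const [inputFile, vaultHost, adminConfig,
        outputFile = 'replication-fix-results.json'] = positional;
    if (!inputFile || !vaultHost || !adminConfig) {
        throw new Error('usage: node fix-missing-replication-permissions.js '
            + '<input-file> <vault-host> <admin-config> [output-file] [options]');
    }
    return { inputFile, vaultHost, adminConfig, outputFile, ...opts };
}
```

For example, `parseArgs(['missing.json', '13.50.166.21', 'admin1.json', '--dry-run'])` keeps the default output file and port while enabling dry-run mode.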
````diff
+### How It Works
+
+1. **Reads** the missing permissions file and groups buckets by account and role
+2. **Maps** account IDs to names using `ownerDisplayName` from the input (no API call)
+3. For each account:
+   - **Generates** a temporary access key via vault admin API (15-minute auto-expiry)
+   - **Creates** one IAM policy per role with `s3:ReplicateObject` for all affected buckets
+   - **Attaches** the policy to the role
+   - **Deletes** the temporary access key (falls back to auto-expiry on failure)
+4. **Writes** results to the output file
````
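The grouping in steps 1 and 2 can be sketched as follows, assuming entries follow the documented missing.json shape and that the account ID is the fifth colon-separated field of the role ARN; the helper and field names are illustrative, not the script's internals:

```javascript
// Sketch of steps 1-2: key missing-permission entries by role ARN so one
// policy per role can cover all of its buckets, recording the account ID
// and ownerDisplayName (which avoids an extra API call) for each group.
function groupByRole(results) {
    const byRole = new Map();
    for (const { bucket, sourceRole, ownerDisplayName } of results) {
        if (!byRole.has(sourceRole)) {
            byRole.set(sourceRole, {
                // e.g. arn:aws:iam::267390090509:role/crr-role-outdated
                accountId: sourceRole.split(':')[4],
                accountName: ownerDisplayName,
                buckets: [],
            });
        }
        byRole.get(sourceRole).buckets.push(bucket);
    }
    return byRole;
}
```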
````diff
+### Policy Created
+
+For each role, the script creates a single policy named
+`s3-replication-audit-fix-<roleName>`:
+
+```json
+{
+  "Version": "2012-10-17",
+  "Statement": [{
+    "Sid": "AllowReplicateObjectAuditFix",
+    "Effect": "Allow",
+    "Action": "s3:ReplicateObject",
+    "Resource": [
+      "arn:aws:s3:::bucket-old-1/*",
+      "arn:aws:s3:::bucket-old-2/*"
+    ]
+  }]
+}
+```
````
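The policy document is a pure function of a role's bucket list, which makes it easy to sketch. The helper name is hypothetical; the `Sid`, `Action`, and `Resource` shape follow the policy shown above:

```javascript
// Builds the documented policy from a role's bucket list: one statement
// with the fixed Sid and one object-level resource ARN per bucket.
function buildPolicyDocument(buckets) {
    return {
        Version: '2012-10-17',
        Statement: [{
            Sid: 'AllowReplicateObjectAuditFix',
            Effect: 'Allow',
            Action: 's3:ReplicateObject',
            Resource: buckets.map(b => `arn:aws:s3:::${b}/*`),
        }],
    };
}
```

Calling `buildPolicyDocument(['bucket-old-1', 'bucket-old-2'])` reproduces the JSON above.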
````diff
+### Output Format
+
+```json
+{
+  "metadata": {
+    "timestamp": "2026-02-23T20:35:00.000Z",
+    "durationMs": 65,
+    "inputFile": "missing.json",
+    "dryRun": false,
+    "counts": {
+      "totalRolesProcessed": 1,
+      "totalBucketsFixed": 3,
+      "policiesCreated": 1,
+      "policiesAttached": 1,
+      "keysCreated": 1,
+      "keysDeleted": 1,
+      "errors": 0
+    }
+  },
+  "fixes": [
+    {
+      "accountId": "267390090509",
+      "accountName": "testaccount",
+      "roleName": "crr-role-outdated",
+      "roleArn": "arn:aws:iam::267390090509:role/crr-role-outdated",
+      "policyName": "s3-replication-audit-fix-crr-role-outdated",
+      "policyArn": "arn:aws:iam::267390090509:policy/s3-replication-audit-fix-crr-role-outdated",
+      "buckets": ["bucket-old-1", "bucket-old-2", "bucket-old-3"],
+      "status": "success"
+    }
+  ],
+  "errors": []
+}
+```
````
````diff
+### Example Run
+
+```
+=== Fix Missing Replication Permissions ===
+Input: missing.json
+Output: replication-fix-results.json
+Vault/IAM: 13.50.166.21:8600
+
+Processing 1 role(s)
+
+[1/1] Role "crr-role-outdated" — account "testaccount" (3 bucket(s))
+  Created policy "s3-replication-audit-fix-crr-role-outdated"
+  Attached policy to role "crr-role-outdated"
+  Deleted temp key for account "testaccount" (267390090509)
+
+=== Summary ===
+Roles processed: 1
+Buckets fixed: 3
+Policies created: 1
+Policies attached: 1
+Keys created: 1
+Keys deleted: 1
+Errors: 0
+Duration: 0.7s
+Output saved to: replication-fix-results.json
+
+Done.
+```
````
````diff
+### Idempotency
+
+The script is safe to run multiple times:
+
+- If the policy already exists, it is reused (not duplicated)
+- Attaching an already-attached policy is a no-op in IAM
+- Temporary access keys auto-expire after 15 minutes even if deletion fails
````
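The first bullet typically rests on catching IAM's `EntityAlreadyExists` error and on policy ARNs being deterministic. A hedged sketch of that pattern; `createPolicy` is an injected stand-in for the real IAM call, not the script's actual code:

```javascript
// Reuse-on-conflict sketch: try to create the policy; if IAM reports it
// already exists, reconstruct its deterministic ARN instead of failing.
// `createPolicy` stands in for the IAM CreatePolicy call and is expected
// to resolve with { Arn } or reject with a named error.
async function ensurePolicy(createPolicy, accountId, policyName) {
    try {
        const { Arn } = await createPolicy(policyName);
        return Arn;
    } catch (err) {
        if (err.name !== 'EntityAlreadyExistsException') throw err;
        // Policy ARNs follow arn:aws:iam::<accountId>:policy/<policyName>,
        // so the existing ARN can be rebuilt without a second lookup.
        return `arn:aws:iam::${accountId}:policy/${policyName}`;
    }
}
```

Any other error still propagates, so genuine failures are not silently swallowed.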
````diff
+### Troubleshooting
+
+**"No ownerDisplayName found for account"**
+
+- The input file is missing `ownerDisplayName`. Re-run `check-replication-permissions.js`
+  to generate a fresh output that includes this field.
+
+**"Failed to generate temp key"**
+
+- Verify admin credentials in the config file
+- Ensure vault admin API is reachable on the specified host and port
+
+**IAM operation errors**
+
+- Check that the IAM port is correct (default 8600, may differ per deployment)
+- Verify the role still exists in vault
+
+**"Failed to delete temp key"**
+
+- Non-critical: the key auto-expires after 15 minutes
+- The error is logged but does not prevent other operations
````

replicationAudit/check-replication-permissions.js

Lines changed: 3 additions & 2 deletions
```diff
@@ -421,7 +421,7 @@ async function main() {
 
     // Process each bucket
     for (let i = 0; i < buckets.length; i++) {
-        const { bucket, sourceRole } = buckets[i];
+        const { bucket, sourceRole, ownerDisplayName } = buckets[i];
 
         if (!sourceRole) {
             logProgress(i + 1, buckets.length, bucket, 'SKIP (no role)');
@@ -440,12 +440,13 @@ async function main() {
             } else {
                 const reason = result.error || 's3:ReplicateObject';
                 logProgress(i + 1, buckets.length, bucket, `MISSING: ${reason}`);
+                result.ownerDisplayName = ownerDisplayName;
                 results.push(result);
                 stats.missing++;
             }
         } catch (e) {
             logProgress(i + 1, buckets.length, bucket, `ERROR: ${e.message}`);
-            results.push({ bucket, sourceRole, error: e.message, policies: [] });
+            results.push({ bucket, sourceRole, ownerDisplayName, error: e.message, policies: [] });
             stats.errors++;
         }
     }
```