# User Transfer Data Migration

The purpose of this tool is to transfer templates stored in DynamoDB from a single user owner to the user's client.

This is a 2-stage process:

1. Plan (your migration)
2. Apply (your migration)
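A condensed end-to-end sketch of these two stages is shown below. The commands are documented in the sections that follow; the plan file name here is illustrative, and `--dryRun` defaults to `true`, so the first apply makes no changes.

```bash
# 1. Plan: writes ./migrations/transfer-plan-*.json plus a copy to S3
npm run plan -- --environment "main" --userPoolId "abc123"

# 2. Dry run the plan (no changes are made while --dryRun is "true", the default)
npm run apply -- --environment "main" --file "./migrations/transfer-plan-example.json"

# 3. Apply for real once the dry-run record looks correct
npm run apply -- --environment "main" --file "./migrations/transfer-plan-example.json" --dryRun "false"
```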

## Plan (your migration)

This creates a `transfer-plan-*.json` file in the local `./migrations` directory and a copy in the `main-acct-migration-backup/<environment>/transfer-plan-*/**` S3 bucket.

```bash
npm run plan -- \
  --environment "main" \
  --userPoolId "abc123" \
  --iamAccessKeyId "abc1234" \
  --iamSecretAccessKey "abc123" \
  --iamSessionToken "abc123"
```
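
To sanity-check the plan output, you can list the local file and its S3 copy. This is only a suggestion: the bucket and prefix follow the pattern above using `main` as the environment, and the exact object keys under `transfer-plan-*/` are assumptions.

```bash
# Locally generated plan file(s)
ls -l ./migrations/

# S3 copy of the plan (bucket/prefix pattern from this README, environment "main")
aws s3 ls "s3://main-acct-migration-backup/main/" --recursive | grep transfer-plan
```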

### Parameters

| Parameter            | Optional | Description                                                                                 |
| -------------------- | -------- | ------------------------------------------------------------------------------------------- |
| --environment        | Required | The environment name, e.g. main                                                             |
| --userPoolId         | Required | The Cognito `UserPoolId` (if running in `sbx`, this can be your `sbx` Cognito `UserPoolId`) |
| --iamAccessKeyId     | Optional | Access key ID of the IAM account (dev/prod)                                                 |
| --iamSecretAccessKey | Optional | Secret access key of the IAM account (dev/prod)                                             |
| --iamSessionToken    | Optional | Session token of the IAM account (dev/prod)                                                 |
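
The `--iam*` values above are standard temporary AWS credentials for the dev/prod IAM account; how you obtain them depends on your team's setup. As one hedged example, if you have a CLI profile backed by long-term IAM user credentials for that account (the profile name below is hypothetical), the AWS CLI can mint a short-lived set:

```bash
# Request temporary credentials (profile name is hypothetical)
aws sts get-session-token --profile templates-dev
# The response's AccessKeyId, SecretAccessKey and SessionToken map onto
# --iamAccessKeyId, --iamSecretAccessKey and --iamSessionToken respectively.
```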

#### Why?

The `transfer-plan-*.json` file keeps a record of the data that will be migrated to client ownership.
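
Since the plan is a JSON file, one low-risk way to review what is in scope before applying is to pretty-print it (assumes `jq` is installed; the file name is illustrative):

```bash
# Inspect the plan before applying it (file name is illustrative)
jq '.' ./migrations/transfer-plan-example.json
```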

## Apply (your migration)

Run the migration process for the data stored in `transfer-plan-*.json`.

When running with `dryRun=false`, the data is backed up before the migration executes. The tool:

1. transfers all user-related files in the `internal` S3 bucket
2. retrieves and stores all related DynamoDB data before executing

Backed-up data is stored in `main-acct-migration-backup/<environment>/transfer-plan-*/**`.

```bash
npm run apply -- \
  --environment "main" \
  --file "./migrations/file.json" \
  --dryRun "true"
```

This script also outputs a file named the same as the input file, with `run` or `dryrun` appended depending on the mode. The file records what happened to each migration: whether it passed or failed, and the stage at which it ended.

### Parameters

| Parameter     | Optional | Description                                                       |
| ------------- | -------- | ----------------------------------------------------------------- |
| --environment | Required | The environment name, e.g. main                                   |
| --file        | Required | The path of the `transfer-plan-*.json` file                       |
| --dryRun      | Optional | Defaults to `true`. When `true`, the migration is _not_ executed  |

## Authentication

You should establish a local AWS authentication session to the Templates (dev/prod) AWS account with sufficient permissions to read, write and delete template data from DynamoDB.
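
How you establish that session depends on how access to the Templates account is set up (SSO, assumed role, etc.). As a sketch only, assuming an AWS SSO profile named `templates-dev` already exists in your `~/.aws/config`:

```bash
# Log in via SSO and point the CLI/SDK at the profile (profile name is an assumption)
aws sso login --profile templates-dev
export AWS_PROFILE=templates-dev

# Confirm you are in the expected account before touching DynamoDB
aws sts get-caller-identity
```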

## Suggestions

When executing the migration, it's a good idea to also write the logs out to a file. For example:

```bash
npm run apply -- \
  --environment "main" \
  --file "./migrations/file.json" \
  --dryRun "false" >> migration.logs.txt
```

Then upload this log file to S3. This should keep a decent record of _what_ happened during the migration.
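
For example, the log could sit next to the rest of the migration's artefacts in the backup bucket; the exact key below is an assumption based on the pattern used elsewhere in this README, with an illustrative plan prefix.

```bash
# Upload the captured logs alongside the other migration artefacts (key is an assumption)
aws s3 cp migration.logs.txt "s3://main-acct-migration-backup/main/transfer-plan-example/migration.logs.txt"
```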