tests/performance/README.md
# Performance Testing

We have performance tests which give us a benchmark of how NRLF performs under load for consumers and producers.
## Run performance tests

### Prep the environment

Performance tests are generally conducted in the perftest environment. There is a selection of tables in the perftest environment representing different pointer volume scenarios, e.g. perftest-baseline vs perftest-1million (todo: update with real names!).
#### Point perftest at a different pointers table

We (will) have multiple tables representing different states of NRLF in the future, e.g. all patients receiving an IPS (International Patient Summary), or onboarding particular high-volume suppliers.

In order to run performance tests and get figures for these different states, we can point the perftest environment at one of these tables.
Currently, this requires tearing down the existing environment and restoring from scratch:

1. Follow the instructions in terraform/infrastructure/readme.md to tear down the perftest environment.
   - Do **not** tear down shared account-wide infrastructure.
2. Update `perftest-pointers-table.name_prefix` in `terraform/account-wide-infrastructure/test/dynamodb__pointers-table.tf` to the table name you want, minus the "-pointers-table" suffix.
   - e.g. to use the baseline table `nhsd-nrlf--perftest-baseline-pointers-table`, set `name_prefix = "nhsd-nrlf--perftest-baseline"`
3. Update `dynamodb_pointers_table_prefix` in `terraform/infrastructure/etc/perftest.tfvars` in the same way.
   - e.g. to use the baseline table, set `dynamodb_pointers_table_prefix = "nhsd-nrlf--perftest-baseline"`
4. Commit your changes to a branch & push.
5. Run the [Deploy Account-wide infrastructure](https://github.com/NHSDigital/NRLF/actions/workflows/deploy-account-wide-infra.yml) workflow against your branch & `account-test`.
   - If you get a terraform failure like "tried to create table but it already exists", you will need to do some finagling:
     1. Make sure there is a backup of your chosen table, or create one if not. In the AWS console: DynamoDB > Tables > your perftest table > Backups > Create backup > Create on-demand backup > leave all settings as defaults > Create backup. This might take up to an hour to complete.
     2. Once backed up, delete your table. In the AWS console: DynamoDB > Tables > your perftest table > Actions > Delete table.
     3. Rerun the Deploy Account-wide infrastructure action.
     4. Terraform will create an empty table with the correct name & (most importantly!) read/write IAM policies.
     5. Delete the empty table created by terraform and restore from the backup, specifying the same table name you've defined in code.
6. Run the [Persistent Environment Deploy](https://github.com/NHSDigital/NRLF/actions/workflows/persistent-environment.yml) workflow against your branch & `perftest` to restore the environment with lambdas pointed at your chosen table.
7. You can check this has been successful by checking the table name in the lambdas.
   - In the AWS console: Lambda > Functions > pick any perftest-1 lambda > Configuration > Environment variables > `TABLE_NAME` should be your desired pointers table, e.g. `nhsd-nrlf--perftest-baseline-pointers-table`

If you've followed these steps, you will also need to [generate permissions](#generate-permissions), as the organisation permissions will have been lost when the environment was torn down.
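As a sanity check for steps 2 & 3, the prefix is simply the full table name with the `-pointers-table` suffix removed. A minimal shell sketch (the table name is the baseline example from above; the step 7 check is shown commented out since it needs AWS credentials and a real lambda name):

```sh
# Derive the name_prefix by stripping the "-pointers-table" suffix (steps 2 & 3)
TABLE_NAME="nhsd-nrlf--perftest-baseline-pointers-table"
NAME_PREFIX="${TABLE_NAME%-pointers-table}"
echo "$NAME_PREFIX" # prints: nhsd-nrlf--perftest-baseline

# Step 7 check via the AWS CLI (commented out; substitute a real lambda name):
# aws lambda get-function-configuration \
#   --function-name "<your-perftest-1-lambda>" \
#   --query 'Environment.Variables.TABLE_NAME' --output text
```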
### Prepare to run tests

#### Pull certs for perftest
```sh
assume management
make truststore-pull-all ENV=perftest
```

#### Generate permissions

You will need to generate pointer permissions the first time performance tests are run in an environment, e.g. if the perftest environment is destroyed & recreated.
```sh
make generate permissions # makes a bunch of json permission files for test organisations
make build # will take all permissions & create nrlf_permissions.zip file

# apply this new permissions zip file to your environment
cd ./terraform/infrastructure
assume test
make init TF_WORKSPACE_NAME=perftest-1 ENV=perftest
make ENV=perftest USE_SHARED_RESOURCES=true apply
```
#### Generate input files

```sh
make perftest-prepare PERFTEST_TABLE_NAME=perftest-baseline
make perftest-consumer ENV_TYPE=perftest PERFTEST_HOST=perftest-1.perftest.record-locator.national.nhs.uk
make perftest-producer ENV_TYPE=perftest PERFTEST_HOST=perftest-1.perftest.record-locator.national.nhs.uk
```
## Assumptions / Caveats

- Run performance tests in the perftest environment only\*
- Both producer & consumer tests are repeatable
- These tests work on the assumption that all NHS numbers in the test data are serial and lie within a fixed range, i.e. picking any number between NHS_NUMBER_MINIMUM & NHS_NUMBER_MAXIMUM will yield a patient with pointer(s).
- Configure scenarios in the `consumer/perftest.config.json` & `producer/perftest.config.json` files. This does not alter the number of stages per scenario; that's fixed in `perftest.js`.
- Consider running these tests multiple times to get figures for a warm environment - perftest, unlike prod, is not well-used, so you will get cold-start figures on your first run.

\*These performance tests are tightly coupled to the seed scripts that populate test data. This means these tests can only be run in an environment containing solely test data created by the seed data scripts. `perftest` is a dedicated environment to do this in, but in theory any environment could be populated with the seed data and used.
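The serial-NHS-number assumption means a test script can pick a valid patient with plain arithmetic rather than a lookup. A minimal sketch, using hypothetical values for the two bounds (the real values come from the seed data scripts) and a deterministic midpoint pick in place of the tests' random choice:

```sh
# Hypothetical bounds; the real values are defined by the seed data scripts
NHS_NUMBER_MINIMUM=9000000000
NHS_NUMBER_MAXIMUM=9000004999

# Any number in [MIN, MAX] should resolve to a seeded patient with pointer(s).
# Here we pick the midpoint; the real tests pick at random within the range.
NHS_NUMBER=$((NHS_NUMBER_MINIMUM + (NHS_NUMBER_MAXIMUM - NHS_NUMBER_MINIMUM) / 2))
echo "$NHS_NUMBER" # prints: 9000002499
```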