2 changes: 1 addition & 1 deletion .github/workflows/_deploy.yml
@@ -152,7 +152,7 @@ jobs:
- uses: actions/checkout@v4
with:
ref: ${{ needs.get-branch-from-workflow-file.outputs.branch_name }}
- if: ${{ env.SCOPE == 'per_workspace' && env.ACCOUNT == 'int' }}
- if: ${{ env.SCOPE == 'per_workspace' && (env.ACCOUNT == 'int' || env.ACCOUNT == 'prod') }}
uses: ./.github/actions/make/
with:
command: swagger--publish
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
@@ -4,7 +4,7 @@ repos:
rev: v1.4.0
hooks:
- id: detect-secrets
exclude: ".pre-commit-config.yaml|infrastructure/localstack/provider.tf|src/etl/sds/tests/changelog|src/etl/sds/worker/bulk/transform_bulk/tests|src/etl/sds/worker/bulk/tests/stage_data|src/api/tests/smoke_tests/test_smoke.py"
exclude: ".pre-commit-config.yaml|infrastructure/localstack/provider.tf|src/etl/sds/tests/changelog|src/etl/sds/worker/bulk/transform_bulk/tests|src/etl/sds/worker/bulk/tests/stage_data|src/api/tests/smoke_tests/test_smoke.py|archived_epr/src_old/api/tests/smoke_tests/test_smoke.py"

- repo: https://github.com/prettier/pre-commit
rev: 57f39166b5a5a504d6808b87ab98d41ebf095b46
4 changes: 4 additions & 0 deletions CHANGELOG.md
@@ -4,6 +4,10 @@
- [PI-754] Search Product
- Dependabot: datamodel-code-generator

## 2025-02-21-a
- [PI-792] Remove EPR & SDS lambdas
- [PI-816] Deploy PROD API Spec

## 2025-02-20
- [PI-790] Remove EPR from swagger

10 changes: 9 additions & 1 deletion README.md
@@ -29,7 +29,7 @@

### Prerequisites

We use `asdf` to fetch the required versions of prerequisite libraries instead of relying on your system's defaults. To get it up and running, go to https://asdf-vm.com/guide/getting-started.html. You can check it is installed properly by running `asdf --version`.
We use `asdf` to fetch the required versions of prerequisite libraries instead of relying on your system's defaults. To get it up and running, go to <https://asdf-vm.com/guide/getting-started.html>. You can check it is installed properly by running `asdf --version`.

However, you will also need to install the `docker engine` separately
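
Once `asdf` is installed, a minimal sketch for pulling in the pinned tool versions might look like the following (this assumes the repository ships a `.tool-versions` file, which the build tooling references; the plugin name is illustrative):

```sh
asdf --version          # confirm asdf is on your PATH
asdf plugin add python  # add a plugin for each tool listed in .tool-versions
asdf install            # install every version pinned in .tool-versions
```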

@@ -469,3 +469,11 @@ To run the SBOM commands there are some make commands that currently handle this

`make generate--sbom`
`make validate--sbom`

## Extras

### Archive

The project was originally designed around a concept of an EPRv2. This did not work out, but we have kept the remains of the EPR work in an archive folder, `archived_epr`, at the root of the repository. The EPR code was supposed to fit into the structure of our existing CPM model, but as requirements came through it became apparent that this would not be possible. In this folder you will find the `swagger/OAS spec`, `lambdas`, `ETL` and `tests`.

This has been left in for future reference. To reinstate it, the code would need to be transferred back into the root of the project, renaming `src_old` back to `src` and merging it into the existing `src` directory.
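
A minimal sketch of that restoration, assuming a straightforward merge (the exact steps would depend on how far the live `src` tree has diverged since the archive was made):

```sh
# From the repository root: merge the archived code into the live source tree
rsync -av archived_epr/src_old/ src/
# Remove the archive once the merge has been reviewed
git rm -r archived_epr
git add src && git commit -m "Restore archived EPR code into src"
```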
2 changes: 1 addition & 1 deletion VERSION
@@ -1 +1 @@
2025.02.21
2025.02.21.a
@@ -21,32 +21,33 @@ def get_endpoint_lambda_mapping() -> ENDPOINT_LAMBDA_MAPPING:
query_parameters = {'value': something}
"""

import api.createCpmProduct.index
import api.createDevice.index
import api.createDeviceAccreditedSystem.index
import api.createDeviceMessageHandlingSystem.index
import api.createDeviceReferenceData.index
import api.createDeviceReferenceDataASActions.index
import api.createDeviceReferenceDataMessageSet.index
import api.createEprProduct.index
import api.createProductTeam.index
import api.createProductTeamEpr.index
import api.deleteCpmProduct.index
import api.deleteEprProduct.index
import api.deleteProductTeam.index
import api.readCpmProduct.index
import api.readDevice.index
import api.readDeviceReferenceData.index

# import api.searchCpmProduct.index
import api.readEprProduct.index
import api.readProductTeam.index
import api.readProductTeamEpr.index
import api.readQuestionnaire.index
import api.searchDeviceReferenceData.index
import api.searchEprProduct.index
import api.searchSdsDevice.index
import api.searchSdsEndpoint.index

import api.createCpmProduct.index
import api.createProductTeam.index
import api.deleteCpmProduct.index
import api.deleteProductTeam.index
import api.readCpmProduct.index
import api.readProductTeam.index
import api.status.index

return {
2 changes: 2 additions & 0 deletions changelog/2025-02-21-a.md
@@ -0,0 +1,2 @@
- [PI-792] Remove EPR & SDS lambdas
- [PI-816] Deploy PROD API Spec
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
[tool.poetry]
name = "connecting-party-manager"
version = "2025.02.21"
version = "2025.02.21.a"
description = "Repository for the Connecting Party Manager API and related services"
authors = ["NHS England"]
license = "LICENSE.md"
4 changes: 2 additions & 2 deletions scripts/builder/build.mk
@@ -6,7 +6,7 @@ POSTMAN_COLLECTION = $(CURDIR)/src/api/tests/feature_tests/postman-collection.js
TOOL_VERSIONS_COPY = $(TIMESTAMP_DIR)/tool-versions.copy
POETRY_LOCK = $(CURDIR)/poetry.lock
INIT_TIMESTAMP = $(CURDIR)/.timestamp/init.timestamp
SRC_FILES = $(shell find src -type f -name "*.py" -not -path "*/feature_tests/*" -not -path "*/test_*" -not -path "*/fhir/r4/strict_models.py" -not -path "*/fhir/r4/models.py")
SRC_FILES = $(shell find src/api src/etl src/layers src/test_helpers -type f -name "*.py" -not -path "*/feature_tests/*" -not -path "*/test_*" -not -path "*/fhir/r4/strict_models.py" -not -path "*/fhir/r4/models.py" -not -path "*/archived_epr/*")
THIRD_PARTY_DIST = $(CURDIR)/src/layers/third_party/dist
SWAGGER_DIST = $(CURDIR)/infrastructure/swagger/dist
SWAGGER_PUBLIC = $(SWAGGER_DIST)/public/swagger.yaml
@@ -32,7 +32,7 @@ clean--build:
build: $(BUILD_TIMESTAMP) ## Complete project install and build artifacts for deployment

$(BUILD_TIMESTAMP): $(BUILD_DEPENDENCIES)
@find $(CURDIR) -name make.py | xargs -n 1 -P 8 -I % bash -c 'poetry run python %'
@find $(CURDIR)/src -name make.py | xargs -n 1 -P 8 -I % bash -c 'poetry run python %'
touch $(BUILD_TIMESTAMP)

generate--sbom: build
2 changes: 2 additions & 0 deletions scripts/builder/common.py
@@ -9,10 +9,12 @@
DIST_DIR = "dist"
BUILD_DIR = "build"
SCRIPT_DIR = "make"
ARCHIVE_DIR = "archived_epr"
UNNECESSARY_DIRS = [
DIST_DIR,
BUILD_DIR,
SCRIPT_DIR,
ARCHIVE_DIR,
"setup.py",
"__pycache__",
"*.dist-info",
13 changes: 7 additions & 6 deletions scripts/builder/lambda_build.py
@@ -34,10 +34,11 @@ def create_zip_package(


def build(file):
lambda_base_dir = get_base_dir(file)
package_name = lambda_base_dir.name
if "archived_epr" not in file:
lambda_base_dir = get_base_dir(file)
package_name = lambda_base_dir.name

with create_zip_package(
package_name=package_name, base_dir=lambda_base_dir
) as build_dir:
copy_source_code(source_dir=lambda_base_dir, build_dir=build_dir)
with create_zip_package(
package_name=package_name, base_dir=lambda_base_dir
) as build_dir:
copy_source_code(source_dir=lambda_base_dir, build_dir=build_dir)
13 changes: 7 additions & 6 deletions scripts/builder/layer_build.py
@@ -47,13 +47,14 @@ def create_zip_package(


def build(file):
layer_base_dir = get_base_dir(file)
package_name = layer_base_dir.name
if "archived_epr" not in file:
layer_base_dir = get_base_dir(file)
package_name = layer_base_dir.name

with create_zip_package(
package_name=package_name, base_dir=layer_base_dir
) as build_dir:
copy_source_code(source_dir=layer_base_dir, build_dir=build_dir)
with create_zip_package(
package_name=package_name, base_dir=layer_base_dir
) as build_dir:
copy_source_code(source_dir=layer_base_dir, build_dir=build_dir)


@contextmanager
35 changes: 18 additions & 17 deletions scripts/builder/third_party_build.py
@@ -98,21 +98,22 @@ def build_third_party(
group: str,
dependencies: dict[str, str],
):
base_dir = get_base_dir(file)
package_zipper = create_zip_package(
package_name=f"third_party_{group}", base_dir=base_dir, third_party=True
)
docker_file = get_dockerfile_path(base_dir=base_dir, group=group)
if "archived_epr" not in file:
base_dir = get_base_dir(file)
package_zipper = create_zip_package(
package_name=f"third_party_{group}", base_dir=base_dir, third_party=True
)
docker_file = get_dockerfile_path(base_dir=base_dir, group=group)

with TemporaryDirectory() as root_dir, package_zipper as build_dir:
root_dir = Path(root_dir).resolve()
venv_dir = root_dir / VENV
with create_temp_path(path=venv_dir, is_dir=True):
create_requirements(
root_dir=root_dir,
pyproject_toml_path=pyproject_toml_path,
group=group,
dependencies=dependencies,
)
docker_run(docker_file=docker_file, root_dir=root_dir, group=group)
copy_source_code(source_dir=venv_dir, build_dir=build_dir)
with TemporaryDirectory() as root_dir, package_zipper as build_dir:
root_dir = Path(root_dir).resolve()
venv_dir = root_dir / VENV
with create_temp_path(path=venv_dir, is_dir=True):
create_requirements(
root_dir=root_dir,
pyproject_toml_path=pyproject_toml_path,
group=group,
dependencies=dependencies,
)
docker_run(docker_file=docker_file, root_dir=root_dir, group=group)
copy_source_code(source_dir=venv_dir, build_dir=build_dir)
4 changes: 0 additions & 4 deletions scripts/infrastructure/apigee/proxygen.sh
@@ -192,12 +192,8 @@ function publish_swagger(){

if [[ ${_aws_environment} == "prod" ]]; then
_flags=""
#elif [[ ${_aws_environment} == "int" ]]; then
else
_flags="--uat"
# else
# echo "ERROR: only environments to deploy to are 'prod' and 'int'"
# exit 1;
fi

echo "
14 changes: 7 additions & 7 deletions scripts/test/test.mk
@@ -10,25 +10,25 @@ RUN_SPEEDTEST = ?= FALSE
PROXYGEN_PRODUCT_TIMESTAMP = $(TIMESTAMP_DIR)/.proxygen-product.stamp

_pytest:
AWS_DEFAULT_REGION=$(AWS_DEFAULT_REGION) AWS_ACCESS_KEY_ID=$(AWS_ACCESS_KEY_ID) AWS_SECRET_ACCESS_KEY=$(AWS_SECRET_ACCESS_KEY) AWS_SESSION_TOKEN=$(AWS_SESSION_TOKEN) poetry run python -m pytest $(PYTEST_FLAGS) $(_INTERNAL_FLAGS) $(_CACHE_CLEAR)
AWS_DEFAULT_REGION=$(AWS_DEFAULT_REGION) AWS_ACCESS_KEY_ID=$(AWS_ACCESS_KEY_ID) AWS_SECRET_ACCESS_KEY=$(AWS_SECRET_ACCESS_KEY) AWS_SESSION_TOKEN=$(AWS_SESSION_TOKEN) poetry run python -m pytest $(PYTEST_FLAGS) --ignore=archived_epr $(_INTERNAL_FLAGS) $(_CACHE_CLEAR)

_behave:
AWS_DEFAULT_REGION=$(AWS_DEFAULT_REGION) AWS_ACCESS_KEY_ID=$(AWS_ACCESS_KEY_ID) AWS_SECRET_ACCESS_KEY=$(AWS_SECRET_ACCESS_KEY) AWS_SESSION_TOKEN=$(AWS_SESSION_TOKEN) poetry run python -m behave src/api/tests/feature_tests $(BEHAVE_FLAGS) $(_INTERNAL_FLAGS) --no-skipped

test--unit: ## Run unit (pytest) tests
$(MAKE) _pytest _INTERNAL_FLAGS="-m 'unit' --ignore=archived_epr $(_INTERNAL_FLAGS)" _CACHE_CLEAR=$(_CACHE_CLEAR)
$(MAKE) _pytest _INTERNAL_FLAGS="-m 'unit' $(_INTERNAL_FLAGS)" _CACHE_CLEAR=$(_CACHE_CLEAR)

test--integration: aws--login ## Run integration (pytest) tests
$(MAKE) _pytest _INTERNAL_FLAGS="-m 'integration' --ignore=archived_epr $(_INTERNAL_FLAGS)" _CACHE_CLEAR=$(_CACHE_CLEAR) AWS_ACCESS_KEY_ID=$(AWS_ACCESS_KEY_ID) AWS_SECRET_ACCESS_KEY=$(AWS_SECRET_ACCESS_KEY) AWS_SESSION_TOKEN=$(AWS_SESSION_TOKEN)
$(MAKE) _pytest _INTERNAL_FLAGS="-m 'integration' $(_INTERNAL_FLAGS)" _CACHE_CLEAR=$(_CACHE_CLEAR) AWS_ACCESS_KEY_ID=$(AWS_ACCESS_KEY_ID) AWS_SECRET_ACCESS_KEY=$(AWS_SECRET_ACCESS_KEY) AWS_SESSION_TOKEN=$(AWS_SESSION_TOKEN)

test--slow: ## Run slow (pytest) tests
$(MAKE) _pytest _INTERNAL_FLAGS="-m 'slow' --ignore=archived_epr" _CACHE_CLEAR=$(_CACHE_CLEAR)
$(MAKE) _pytest _INTERNAL_FLAGS="-m 'slow'" _CACHE_CLEAR=$(_CACHE_CLEAR)

test--s3: aws--login ## Run (pytest) tests that require s3 downloads
$(MAKE) _pytest _INTERNAL_FLAGS="-m 's3' --ignore=archived_epr $(_INTERNAL_FLAGS)" _CACHE_CLEAR=$(_CACHE_CLEAR) AWS_ACCESS_KEY_ID=$(AWS_ACCESS_KEY_ID) AWS_SECRET_ACCESS_KEY=$(AWS_SECRET_ACCESS_KEY) AWS_SESSION_TOKEN=$(AWS_SESSION_TOKEN)
$(MAKE) _pytest _INTERNAL_FLAGS="-m 's3' $(_INTERNAL_FLAGS)" _CACHE_CLEAR=$(_CACHE_CLEAR) AWS_ACCESS_KEY_ID=$(AWS_ACCESS_KEY_ID) AWS_SECRET_ACCESS_KEY=$(AWS_SECRET_ACCESS_KEY) AWS_SESSION_TOKEN=$(AWS_SESSION_TOKEN)

test--smoke: aws--login ## Run end-to-end smoke tests (pytest)
AWS_DEFAULT_REGION=$(AWS_DEFAULT_REGION) AWS_ACCESS_KEY_ID=$(AWS_ACCESS_KEY_ID) AWS_SECRET_ACCESS_KEY=$(AWS_SECRET_ACCESS_KEY) AWS_SESSION_TOKEN=$(AWS_SESSION_TOKEN) WORKSPACE=$(WORKSPACE) ACCOUNT=$(ACCOUNT) poetry run python -m pytest $(PYTEST_FLAGS) -m 'smoke' --ignore=src/layers --ignore=src/etl --ignore=archived_epr/src $(_CACHE_CLEAR)
AWS_DEFAULT_REGION=$(AWS_DEFAULT_REGION) AWS_ACCESS_KEY_ID=$(AWS_ACCESS_KEY_ID) AWS_SECRET_ACCESS_KEY=$(AWS_SECRET_ACCESS_KEY) AWS_SESSION_TOKEN=$(AWS_SESSION_TOKEN) WORKSPACE=$(WORKSPACE) ACCOUNT=$(ACCOUNT) poetry run python -m pytest $(PYTEST_FLAGS) -m 'smoke' --ignore=src/layers --ignore=src/etl --ignore=archived_epr $(_CACHE_CLEAR)

test--%--rerun: ## Rerun failed integration or unit (pytest) tests
$(MAKE) test--$* _INTERNAL_FLAGS="--last-failed --last-failed-no-failures none" _CACHE_CLEAR=$(_CACHE_CLEAR)
@@ -42,4 +42,4 @@ test--feature--%--auto-retry: ## Autoretry of failed feature (gherkin) tests
$(MAKE) test--feature--$* _INTERNAL_FLAGS="--define='auto_retry=true'"

test--sds--matrix: ## Run end-to-end smoke tests that check data matches between cpm and ldap
SDS_PROD_APIKEY=$(SDS_PROD_APIKEY) SDS_DEV_APIKEY=$(SDS_DEV_APIKEY) USE_CPM_PROD=$(USE_CPM_PROD) TEST_COUNT=$(TEST_COUNT) COMPARISON_ENV=$(COMPARISON_ENV) RUN_SPEEDTEST=$(RUN_SPEEDTEST) poetry run python -m pytest $(PYTEST_FLAGS) -m 'matrix' --ignore=src/layers --ignore=src/etl $(_CACHE_CLEAR)
SDS_PROD_APIKEY=$(SDS_PROD_APIKEY) SDS_DEV_APIKEY=$(SDS_DEV_APIKEY) USE_CPM_PROD=$(USE_CPM_PROD) TEST_COUNT=$(TEST_COUNT) COMPARISON_ENV=$(COMPARISON_ENV) RUN_SPEEDTEST=$(RUN_SPEEDTEST) poetry run python -m pytest $(PYTEST_FLAGS) -m 'matrix' --ignore=src/layers --ignore=src/etl --ignore=archived_epr $(_CACHE_CLEAR)