
Commit 60c990d

Add notes on the project structure
1 parent 7f9f675 commit 60c990d

File tree

1 file changed, +22 −0 lines


README.md

Lines changed: 22 additions & 0 deletions
@@ -48,3 +48,25 @@ You can use `utils/simulate_events.py` to simulate events and test the cloudform
- Now run `uv run python utils/simulate_events.py --stream-name fraud-detection-TransactionIngestion-mHXRRNzZAWJV --num-transactions 1000`

The `simulate_events.py` script pushes random transaction events, some with an error in the transaction-id format and some in the correct format; you should observe data being pushed into both your valid and invalid S3 buckets.
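As a rough illustration of what the simulator does, here is a minimal sketch of generating a mixed batch of well-formed and malformed events. The field names, the "correct" transaction-id format, and the error rate are all assumptions for illustration; the real schema lives in `utils/simulate_events.py` and the real validation rule in the validation lambda.

```python
import random
import uuid


def make_transaction(valid: bool) -> dict:
    """Build one transaction event (hypothetical schema)."""
    # Assumption: a UUID stands in for a "correct" transaction id,
    # and a "bad-id-*" string stands in for a malformed one.
    txn_id = str(uuid.uuid4()) if valid else f"bad-id-{random.randint(0, 999)}"
    return {
        "transaction_id": txn_id,
        "amount": round(random.uniform(1.0, 500.0), 2),
    }


def make_batch(num_transactions: int, error_rate: float = 0.2) -> list:
    """Mix well-formed and malformed transaction ids, as the simulator does."""
    return [make_transaction(random.random() >= error_rate)
            for _ in range(num_transactions)]


# Each event would then be serialized to JSON and pushed to the
# Kinesis stream, e.g. via boto3:
#   kinesis.put_record(StreamName=stream, Data=json.dumps(event),
#                      PartitionKey=event["transaction_id"])
```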
## Tech project structure

This is a mono repo with multiple sub-projects: each Lambda function's logic lives in its own folder in the `src` directory, and the build workflow builds each of them and pushes the image to ECR, where it can then be used by AWS Lambda.
```
├── src
│   ├── fraud_detection_model
│   │   ├── app.py
│   │   ├── Dockerfile
│   │   ├── fraud_detection_model.pkl
│   │   └── requirements.txt
│   └── validation_lambda
│       ├── app.py
│       ├── Dockerfile
│       └── requirements.txt
```
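Each sub-project's `Dockerfile` would typically follow the standard AWS Lambda container-image pattern. A minimal sketch, assuming Python 3.12 and a handler named `app.lambda_handler` (both assumptions; the actual base image and handler name in this repo may differ):

```dockerfile
FROM public.ecr.aws/lambda/python:3.12

# Install the function's dependencies into the Lambda task root
COPY requirements.txt .
RUN pip install -r requirements.txt

# Copy the function code (and, for the model lambda, the .pkl artifact)
COPY app.py ./

# Hypothetical handler name: module "app", function "lambda_handler"
CMD ["app.lambda_handler"]
```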
The serverless infra is managed by the `template.yaml` file using CloudFormation. The best process I have found so far for maintaining it is:

`Edit in Infrastructure Composer on AWS using the UI -> copy over locally -> Update code and workflow as needed -> test with act -> deploy`

I don't believe this is a good approach; ideally I would want the template split into separate folders and loaded dynamically based on the environment, but that was taking too much time to get right. I would also prefer Terraform over this, but due to the time limit I am sticking to the process above.
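For reference, a container-image Lambda in a template like this would look roughly as follows. This is a hedged sketch using SAM's `AWS::Serverless::Function`; the logical ID, ECR repository name, and timeout are hypothetical and the actual resources in `template.yaml` may differ:

```yaml
Resources:
  ValidationFunction:                # hypothetical logical ID
    Type: AWS::Serverless::Function
    Properties:
      PackageType: Image             # deploy from the ECR image built by the workflow
      ImageUri: !Sub "${AWS::AccountId}.dkr.ecr.${AWS::Region}.amazonaws.com/validation_lambda:latest"
      Timeout: 30
```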
