Commit 6c7b924
[Node] Adding a custom log (#421)
*Issue description:*
We need a reliable way to validate SigV4 logs across different language
implementations (Python, JavaScript, .NET, Java) for our sample
applications. The current logging doesn't provide a consistent, easily
identifiable log entry for this purpose. This PR adds a standardized
SigV4 logging mechanism to the JavaScript sample application, similar to
what was implemented for Python in [PR #406](#406) and for Java in
[PR #411](#411).
*Description of changes:*
Added a custom WARNING log in the aws_sdk_call function of the JavaScript
sample application. This custom log will be present in the log group and
can be used to distinguish validation logs from those created while running
the sample application. This change provides a consistent log entry that
can be replicated across other language implementations for uniform SigV4
log validation.
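For reference, a minimal sketch of what the change looks like in the frontend service's `index.js`. The Winston logger setup and the Express-style handler shape are assumptions for illustration; only the `logger.warn(...)` line reflects the change described in this PR.

```js
// Sketch only: the logger configuration below is an assumption -- the sample
// app may set up its logger differently. The logger.warn(...) line is the
// custom log added by this PR.
const winston = require('winston');

const logger = winston.createLogger({
  level: 'info',
  format: winston.format.simple(),
  transports: [new winston.transports.Console()],
});

async function aws_sdk_call(req, res) {
  // Consistent, easily identifiable entry used to filter SigV4 validation
  // logs in the CloudWatch log group.
  logger.warn('This is a custom log for validation testing');

  // ... existing AWS SDK call that produces the SigV4-signed request ...
  res.send('ok');
}
```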
*Rollback procedure:*
1. Remove the line `logger.warn("This is a custom log for validation
testing");` from the aws_sdk_call function in `index.js` of the frontend
service.
2. Commit and push the change.
3. Redeploy the application.
*Testing:*
For testing, a sample app was instrumented with ADOT JS to verify that the
new custom log appears in the CloudWatch console. Below are screenshots of
the log from the CloudWatch console.
<img width="1400" alt="Screenshot 2025-07-03 at 10 19 03 AM"
src="https://github.com/user-attachments/assets/97e845ed-2856-471a-b4ae-8125f40ad063"
/>
<img width="1404" alt="Screenshot 2025-07-03 at 10 19 12 AM"
src="https://github.com/user-attachments/assets/6f26b52a-3415-47d3-9c17-74f801b3b219"
/>
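Besides checking the console, the same entry can be confirmed programmatically. The sketch below uses the AWS SDK for JavaScript v3; the log group name and region are placeholders, not values taken from this PR.

```js
// Sketch only: log group name and region are placeholders.
const {
  CloudWatchLogsClient,
  FilterLogEventsCommand,
} = require('@aws-sdk/client-cloudwatch-logs');

async function findCustomLog() {
  const client = new CloudWatchLogsClient({ region: 'us-east-1' });
  const { events = [] } = await client.send(
    new FilterLogEventsCommand({
      logGroupName: '/aws/sample-app/frontend-service', // placeholder
      filterPattern: '"This is a custom log for validation testing"',
    })
  );
  for (const event of events) {
    console.log(new Date(event.timestamp), event.message);
  }
}

findCustomLog().catch(console.error);
```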
*Ensure you've run the following tests on your changes and include the
link below:*
To do so, create a `test.yml` file with `name: Test` and workflow
description to test your changes, then remove the file for your PR. Link
your test run in your PR description. This process is a short term
solution while we work on creating a staging environment for testing.
NOTE: TESTS RUNNING ON A SINGLE EKS CLUSTER CANNOT BE RUN IN PARALLEL.
See the
[needs](https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idneeds)
keyword to run tests in succession.
- Run Java EKS on `e2e-playground` in us-east-1 and eu-central-2
- Run Python EKS on `e2e-playground` in us-east-1 and eu-central-2
- Run metric limiter on EKS cluster `e2e-playground` in us-east-1 and
eu-central-2
- Run EC2 tests in all regions
- Run K8s on a separate K8s cluster (check IAD test account for master
node endpoints; these will change as we create and destroy clusters for
OS patching)
By submitting this pull request, I confirm that my contribution is made
under the terms of the Apache 2.0 license.
Co-authored-by: Jeel Mehta <[email protected]>
1 file changed: +5 additions, −0 deletions (lines 46–50 added).