feat: Add JSON logging capability to integrated_channels #2460
base: master
Conversation
(force-pushed from 22444a6 to de952bc)
(force-pushed from b6ad6d5 to 0b6800a)
pwnage101 left a comment:
Please do not add a layer of abstraction to functionality which already supports pluggable handlers. Using LOGGER.*() function calls directly inside of production code is also more pythonic, IMO.
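For context, the "pluggable handlers" suggestion above is just standard Python/Django logging configuration. The sketch below is illustrative only and not code from this PR; the formatter class, logger names, and fields are invented. The point it demonstrates is that JSON output can be plugged in once via the logging config, while production code keeps calling LOGGER.*() directly with no wrapper layer.

```python
# Illustrative sketch, not code from this PR: JSON formatting is "plugged in"
# through the logging configuration, and production code calls LOGGER.*()
# directly with no wrapper layer.
import json
import logging
import logging.config


class JsonFormatter(logging.Formatter):
    """Hypothetical formatter that renders each record as one JSON object."""

    def format(self, record):
        return json.dumps({
            "level": record.levelname,
            "name": record.name,
            "message": record.getMessage(),
        })


LOGGING = {  # in Django this would live in settings.LOGGING
    "version": 1,
    "formatters": {"json": {"()": JsonFormatter}},
    "handlers": {"console": {"class": "logging.StreamHandler", "formatter": "json"}},
    "loggers": {"integrated_channels": {"handlers": ["console"], "level": "INFO"}},
}
logging.config.dictConfig(LOGGING)

# Production code stays plain: a direct logger call, no abstraction layer.
LOGGER = logging.getLogger("integrated_channels.example")
LOGGER.info("Transmission succeeded")
```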
Hey @pwnage101, thank you for reviewing the PR. I absolutely agree with the approach you proposed. Let me refactor my code.
(force-pushed from 1a480f7 to 235515d)
pwnage101 left a comment:
LGTM!
FYI, our current Datadog log parsing configuration does NOT support JSON parsing. You need to come up with a proposal to add JSON parsing to the Datadog Grok parser config in a way that is still backwards-compatible. For example, just changing …

I minimally tested this approach and it seems to work, but I highly recommend you do your own testing to validate the approach for more scenarios:

-_message_with_keyvalue %{regex("(?<message>.*)"):message_keys:keyvalue}
+_message_with_keyvalue (%{regex(".*(?= \\{.*\\})"):message} %{data:json_keys:json}|%{regex("(?<message>.*)"):message_keys:keyvalue})

This matches existing messages, optionally with key-value pairs, but if the message ends with JSON it switches to the alternative pattern, which uses the json matcher to parse the trailing JSON. Please perform testing on the actual Grok parser edit page (but do not hit save until you get peer review on the new regex): https://app.datadoghq.com/logs/pipelines/pipeline/of_y4sdqStWBHmooL2W3UQ/processors/edit/Wz-1KN_ISsyavH79U5dmgg
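As a rough local sanity check of the alternation above (not a substitute for testing in the real Grok parser editor), a plain Python regex can approximate the "message optionally followed by trailing JSON" split. The sample log lines and field names below are made up, and Grok matchers such as %{data:...:json} do more than a plain regex, so treat any result here as a hint only.

```python
# Rough local approximation of the proposed Grok alternation, for sanity
# checking sample log lines before editing the real Datadog pipeline.
import json
import re

# A message followed by a trailing JSON object; otherwise the whole line is the message.
WITH_TRAILING_JSON = re.compile(r"^(?P<message>.*?)(?= \{.*\}$) (?P<json>\{.*\})$")


def split_line(line):
    match = WITH_TRAILING_JSON.match(line)
    if not match:
        return {"message": line}
    try:
        return {"message": match.group("message"), "json": json.loads(match.group("json"))}
    except ValueError:
        # The trailing "{...}" was not valid JSON; fall back to treating the
        # whole line as a plain message.
        return {"message": line}


print(split_line("Transmission succeeded"))
print(split_line('Transmission succeeded {"channel": "SAP", "duration_ms": 42}'))
```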
This one is even better, IMO: it supports log lines with only a message, and also log lines with extra structured data.
Ignore my latest comments; I did not realize the entire log line would be JSON and would trigger the JSON preprocessing step in the Datadog logs pipeline.
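To make that distinction concrete, here is a hypothetical comparison of the two log-line shapes discussed in this thread. Field names and values are illustrative, not from this PR.

```python
# Illustrative comparison of the two log-line shapes discussed above.
import json

# 1) The entire line is JSON: Datadog's JSON preprocessing step parses it,
#    so no Grok parser change is needed.
full_json_line = json.dumps({"level": "INFO", "message": "Transmission succeeded", "channel": "SAP"})

# 2) A plain message with a trailing JSON payload: the shape the Grok
#    alternation proposed earlier was meant to handle.
mixed_line = "Transmission succeeded " + json.dumps({"channel": "SAP"})

print(full_json_line)
print(mixed_line)
```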


This PR implements enhanced structured logging utilities for Integrated Channels to improve Datadog monitoring and alerting, addressing log level inconsistencies and poor structured data parsing identified in ENT-8024.
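Below is a minimal sketch of the style of structured log line discussed in the review, assuming the convention settled on there (a direct LOGGER call whose message carries a JSON payload Datadog can extract). This is not the PR's actual implementation; the logger name and context fields are illustrative.

```python
# Not the PR's actual implementation: a direct LOGGER call with a JSON
# payload in the message. All field names below are illustrative.
import json
import logging

LOGGER = logging.getLogger(__name__)

context = {
    "integrated_channel_code": "SAP",
    "enterprise_customer_uuid": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
    "status_code": 200,
}

# Human-readable message first, machine-parseable JSON context at the end.
LOGGER.info("Content metadata transmission finished %s", json.dumps(context))
```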
Merge checklist:
- Any new requirements are in the right place (do not manually modify the requirements/*.txt files):
  - base.in if needed in production but edx-platform doesn't install it
  - test-master.in if edx-platform pins it, with a matching version
  - make upgrade && make requirements have been run to regenerate requirements
- make static has been run to update webpack bundling if any static content was updated
- ./manage.py makemigrations has been run (./manage.py lms makemigrations in the shell)

Post merge:
- Once your build finishes, after maybe a minute you should see the new version in PyPI automatically (on refresh).
- make upgrade in edx-platform will look for the latest version in PyPI.