PR 6: Add WebSocket and JWT Middleware Unit Tests for Chat Functionality #170
ashley-arellano merged 13 commits into main
Conversation
This reverts commit 5553b30.
Codecov Report: All modified and coverable lines are covered by tests ✅
Additional details and impacted files
@@ Coverage Diff @@
##             main     #170       +/-  ##
===========================================
+ Coverage   63.94%   81.47%   +17.53%
===========================================
  Files          15       17        +2
  Lines         624      772      +148
  Branches       49       51        +2
===========================================
+ Hits          399      629      +230
+ Misses        204      119       -85
- Partials       21       24        +3
☔ View full report in Codecov by Sentry.
- Realized that Codecov was not showing all of the tests we wrote, which gave us an incorrect coverage figure.
Please note that the only files changed for this specific PR are test_consumers.py and test_jwt_middleware.py.
alliekameda
left a comment
🔍 Peer Review: Great Job on This PR!
Hey Ashley 👋 — just finished reviewing your PR "PR 6: Add WebSocket and JWT Middleware Unit Tests for Chat Functionality #170". Here’s my feedback:
✅ What Looks Great
- The core feature has been implemented cleanly — functionality aligns perfectly with real-time chat requirements.
- Excellent test coverage across edge cases like missing tokens, malformed tokens, and invalid room names.
- Use of pytest.mark.asyncio, WebsocketCommunicator, and test fixtures shows solid command of async testing practices.
- Test cases simulate both success and error paths.
- Token creation and parsing are realistic and secure.
- Edge case handling (e.g., broken token, missing room) is accounted for.
- Assertions are clean, descriptive, and well-scoped.
- All tests passed. There is a non-urgent warning that could likely be resolved by running pip install --upgrade djangorestframework.
💡 Suggestions for Improvement (if any)
These are ideas for future improvements or things to address before merging.
- In test_jwt_middleware.py, consider asserting more context in the response scope (e.g., that user is attached correctly, not just user_id).
- You could add disconnect and reconnect scenarios to simulate flaky network behavior or a user switching tabs; this helps ensure full lifecycle robustness.
- Optionally log or print structured debug messages on token validation failure in the middleware; this could help pinpoint JWT issues more quickly in development.
- Add tests for unauthorized access attempts (e.g., users trying to connect to a chat room they’re not part of).
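The unauthorized-access suggestion above could be exercised against a plain room-membership gate before any WebSocket machinery is involved. This is a minimal sketch under assumed names (RoomRegistry, should_accept are illustrative, not from the PR's code):

```python
class RoomRegistry:
    """Tracks which user ids belong to which chat room (illustrative only)."""

    def __init__(self):
        self._members = {}  # room name -> set of member user ids

    def add(self, room, user_id):
        self._members.setdefault(room, set()).add(user_id)

    def user_in_room(self, room, user_id):
        return user_id in self._members.get(room, set())


def should_accept(registry, room, user_id):
    """Mirror of a connect()-time check: reject users outside the room."""
    return registry.user_in_room(room, user_id)
```

A test for unauthorized access would then assert that should_accept returns False for a user who was never added to the room, before asserting the consumer closes the connection.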
🧪 Tested This Feature
I ran the following test steps:
- ✅ Run pip install -r requirements.txt
- ✅ Run python manage.py migrate
- ✅ Run python manage.py runserver
- ✅ In another terminal, run redis-cli ping
- ✅ If no response, run sudo service redis-server start
- ✅ Run pytest --cov --cov-branch --cov-report=xml
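The connect/send/receive/disconnect flow those tests exercise can be mirrored with a stdlib-only toy, since the real suite's channels.testing.WebsocketCommunicator needs a running Django/Channels app. FakeChatConsumer below is a stand-in invented for illustration, not the PR's consumer:

```python
import asyncio

class FakeChatConsumer:
    """Toy consumer mirroring the lifecycle the real tests drive."""

    def __init__(self):
        self.connected = False
        self.outbox = asyncio.Queue()

    async def connect(self):
        self.connected = True

    async def send_json(self, payload):
        # Echo typing events back, the way the tested consumer broadcasts them.
        if payload.get("type") == "typing":
            await self.outbox.put({"type": "typing", "user": payload["user"]})

    async def receive_json(self):
        return await self.outbox.get()

    async def disconnect(self):
        self.connected = False


async def run_lifecycle():
    consumer = FakeChatConsumer()
    await consumer.connect()
    await consumer.send_json({"type": "typing", "user": "ashley"})
    event = await consumer.receive_json()
    await consumer.disconnect()
    return consumer.connected, event
```

With pytest-asyncio, the same four awaits would live in a test decorated with pytest.mark.asyncio against the real communicator.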
🔄 Next Steps
- Would be cool to see follow-up test cases for disconnect/reconnect, expiring tokens, or multiple tabs/users in a chat.
- If middleware behavior ever becomes more complex, consider isolated middleware unit tests using mocked token decoding for even faster feedback.
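The "mocked token decoding" idea above amounts to injecting the decoder, so middleware unit tests can swap in a stub instead of performing real JWT verification. A minimal sketch, assuming invented names (authenticate_scope, fake_decode are not from the PR):

```python
def authenticate_scope(scope, decode_token):
    """Attach user_id to a Channels-style scope using the injected decoder."""
    token = scope.get("token")
    if not token:
        return {**scope, "user_id": None}
    try:
        claims = decode_token(token)
    except ValueError:
        # Malformed or invalid token: fall back to an anonymous scope.
        return {**scope, "user_id": None}
    return {**scope, "user_id": claims.get("user_id")}


def fake_decode(token):
    """Test stub standing in for real JWT verification."""
    if token == "good":
        return {"user_id": 42}
    raise ValueError("invalid token")
```

Because the stub never touches cryptography or the database, these middleware tests run in microseconds, which is the faster-feedback benefit mentioned above.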
🚀 Final Thoughts
This PR sets a solid foundation for reliable, scalable WebSocket functionality with secure JWT handling. Great job Ashley! (˶˃ ᵕ ˂˶) .ᐟ.ᐟ
🔍 Peer Review: Solid Work on This PR!
Hey Ashley 👋 I just finished reviewing your PR titled "PR 6: Add WebSocket and JWT Middleware Unit Tests for Chat Functionality #170".
Here’s my feedback:
✅ What Looks Great
- Test Coverage: Unit tests for both WebSocket consumer behaviors and JWT middleware are implemented thoroughly and follow expected real-time interaction flows.
- Descriptive Test Cases: All tests are clearly named and scoped. It is easy to follow what each unit test is testing (for example: valid/invalid token handling, typing indicators, seen status).
- Edge Case Handling: JWT middleware is validated under all critical scenarios (valid token, invalid, missing, malformed). Loved the exception coverage.
- Clean & Modular: Fixtures are used properly, and the test logic is DRY and readable.
- The PR description is thorough and detailed, follows the team template, and thus helps reviewers understand the scope of your additions clearly. The "How to Test This PR" guide was missing a few steps, which was resolved by consulting Allison's PR review.
- All tests pass, but there's a warning that gets fixed after running pip install --upgrade djangorestframework (thanks to Allison).
💡 Suggestions for Improvement (if any)
These are ideas for future improvements or things to address before merging.
- Expand Typing Indicator Scenarios: Maybe test typing events between multiple users (ex: user A starts typing, user B receives the typing event).
- Seen Status Ordering: You could test what happens if seen status IDs are sent out of order or duplicated (does the consumer handle it cleanly?).
- Token Expiry Timing Test (Stretch Goal): Create a token that's already expired and try connecting (to ensure a strict failure).
- A couple of the test methods could benefit from short comments to explain intent. Would help future readers.
🧪 Tested This Feature
I ran the following test steps:
- ✅ Run pip install -r requirements.txt (so dependencies are installed)
- ✅ Run python manage.py migrate
- ✅ Run python manage.py runserver
- ✅ In another terminal, run redis-cli ping; if no response, run brew services start redis
- ✅ Run pytest --cov --cov-branch --cov-report=xml to run the unit tests
🔄 Next Steps
- Consider adding tests for: WebSocket disconnect/reconnect, unauthorized connection attempts, token expiration mid-session, invalid payloads, and access-control enforcement.
🚀 Final Thoughts
Great job with this unit test implementation! These tests are critical for making sure real-time features like chat work between users. Furthermore, it's important to ensure WebSocket connections work securely with JWT-based authentication and that messages behave as expected during different interactions.
Keep up the awesome work!
-- Ethan
5f0a58d
EthanZ23
left a comment
I see you've resolved the conflicts and merged with the main branch. The changes look good. Approving the PR.



Overview
What does this PR do?
This PR adds comprehensive unit and integration tests for the WebSocket-based chat system and JWT authentication middleware used in real-time communication within the platform. The goal is to ensure secure, reliable, and expected behavior across multiple WebSocket interaction scenarios.
Key Features & Changes
What has been added or changed?
✔ Core Feature Implementation:
Added test_consumers.py for WebSocket interaction testing using WebsocketCommunicator.
Added test_jwt_middleware.py for validating JWT middleware during WebSocket connections.
✔ New Components/Files:
Created test class TestChatWebsockets with cases for:
- Successful connection.
- Missing chat room handling.
- Typing indicator broadcasting.
- Message seen status updating.
Created test class TestJWTAuthMiddleware covering:
- Valid/invalid token handling.
- Missing token scenarios.
- Exception handling for malformed tokens.
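Middleware that authenticates a WebSocket handshake typically has to pull the token out of the connection scope first. Whether this PR's middleware reads it from the query string or from headers is an assumption; this stdlib sketch (token_from_scope is an invented name) shows the common query-string variant, which covers the missing-token case above:

```python
from urllib.parse import parse_qs

def token_from_scope(scope):
    """Pull a ?token=... value out of a Channels-style scope, if present.

    Channels stores the raw query string as bytes under 'query_string'.
    Returns None when no token was supplied (the missing-token test case).
    """
    query = parse_qs(scope.get("query_string", b"").decode())
    values = query.get("token")
    return values[0] if values else None
```

Keeping extraction separate from validation is what makes the missing-token and malformed-token scenarios easy to test independently.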
Why This Is Needed
What problem does this solve?
Related Issue
🔗 None
Implementation Details
How was this feature developed?
Used pytest.mark.asyncio for async testing support.
Used Django's test database and @pytest.mark.django_db(transaction=True) to isolate DB changes.
Employed channels.testing.WebsocketCommunicator and ApplicationCommunicator to simulate WebSocket traffic.
JWT tokens generated using RefreshToken.for_user for real authentication simulation.
Design Documentation (If Applicable)
Not applicable
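The RefreshToken.for_user call mentioned above (from djangorestframework-simplejwt) ultimately yields a signed header.payload.signature string. This stdlib sketch reproduces that HS256 structure for illustration only; it is not a replacement for the library, and the secret and payload here are invented:

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWT segments are encoded."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_jwt(payload: dict, secret: bytes) -> str:
    """Build an HS256 JWT by hand: header.payload.signature."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

# Three dot-separated base64url segments, like the tokens the tests mint.
token = make_jwt({"user_id": 7}, b"dev-secret")
```

Seeing the structure explains why the malformed-token tests can trigger failures simply by dropping a segment or corrupting the signature.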
How to Test This PR
Step-by-step testing guide
1️⃣ Ensure all migrations have been applied via python manage.py migrate
2️⃣ Go to the backend folder (see the ReadMe if unsure how to get there)
3️⃣ Run pytest --cov --cov-branch --cov-report=xml
4️⃣ Ensure that apps/chat/test_jwt_middleware.py and apps/chat/test_consumers.py pass.
Files Added/Modified
📂 List of important files changed in this PR
test_consumers.py
test_jwt_middleware.py
AI Code Usage
🤖 Was AI-generated code used in this PR?
Checklist for Reviewers
Things to check before approving this PR:
Do all test cases pass consistently and cover edge cases?
Are tests clearly named and descriptive of behavior being tested?
Is the JWT middleware behavior properly validated under all scenarios (valid, invalid, missing token)?
Does the WebSocket consumer gracefully handle unknown rooms, typing status, and seen status updates?
Is the test code clean, modular, DRY, and using appropriate fixtures?
Next Steps & Future Enhancements
What improvements can be made after merging?
Final Notes
Thanks for reviewing this PR! : D