
Conversation

@mahdirahimi1999
Contributor

This PR introduces a security hardening measure to protect against recursion-based Denial of Service (DoS) attacks. By implementing a max_depth constraint, the framework can now prevent resource exhaustion caused by maliciously crafted, deeply nested JSON payloads.

Problem:

Structured fields like ListField and DictField were previously vulnerable to deep nesting attacks, potentially leading to excessive CPU consumption or uncontrolled recursion (RecursionError), especially when using recursive serializers or nested schemas.

Solution:

  • Dual-Layer Protection: The max_depth limit validates both the field schema and the raw input data. This ensures rejection of deep payloads even if a field is defined without explicit children.
  • Efficient Depth Tracking: Implemented a non-breaking depth propagation mechanism using the internal bind() lifecycle (see the sketch after this list), ensuring zero overhead for standard flat validations.
  • Early Exit: Data inspection terminates immediately upon reaching the depth threshold, preventing unnecessary traversal of the remaining payload.
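
The propagation mechanism itself is internal, but the general shape is that a field learns its depth budget from its parent at bind() time. Below is a minimal sketch of that pattern, assuming a wrapper subclass for illustration: the bind() signature is DRF's public Field API and _root_max_depth is the attribute name visible in the review excerpt further down, while the subclass itself and the decrement-per-level bookkeeping are assumptions rather than this PR's actual code.

from rest_framework import serializers

class DepthLimitedListField(serializers.ListField):
    """Illustrative only: shows how a depth budget could travel through bind()."""

    def __init__(self, *args, max_depth=None, **kwargs):
        self.max_depth = max_depth
        super().__init__(*args, **kwargs)
        # Before binding, the field's own limit is the only budget available.
        self._root_max_depth = max_depth

    def bind(self, field_name, parent):
        super().bind(field_name, parent)
        # Inherit the parent's remaining budget unless an explicit
        # max_depth was set on this field (assumed precedence rule).
        parent_budget = getattr(parent, '_root_max_depth', None)
        if self.max_depth is None and parent_budget is not None:
            self._root_max_depth = parent_budget - 1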

Usage Example:

from rest_framework import serializers

# 1. Protecting nested schemas (List/Dict/Serializer)
class NestedSerializer(serializers.Serializer):
    data = serializers.ListField(
        child=serializers.DictField(),
        max_depth=5  # Enforces limit across the entire subtree
    )

# 2. Structural data protection (Protects even flat definitions)
field = serializers.ListField(max_depth=2)

# This will raise a ValidationError due to data depth, 
# preventing deep recursion during processing.
field.run_validation([[[[1, 2]]]])

Changes:

  • Added max_depth parameter to ListField and DictField.
  • Enhanced BaseSerializer to support cross-boundary depth propagation.
  • Added comprehensive test suite covering nested serializers, mixed containers, and flat field data-inspection.
  • Updated API documentation and localized error messages.

Impact:

  • Security: Hardens the framework against a common class of resource exhaustion attacks.
  • Compatibility: Zero breaking changes; all existing public API signatures remain intact.

@mahdirahimi1999 force-pushed the fix/prevent-recursion-limit-fields branch from 20bbf0d to acc3fd7 on December 30, 2025 14:24
* `allow_empty` - Designates if empty lists are allowed.
* `min_length` - Validates that the list contains no fewer than this number of elements.
* `max_length` - Validates that the list contains no more than this number of elements.
* `max_depth` - Validates that nesting depth does not exceed this value. This applies to both the field schema depth and the raw input data depth. Depth of 0 permits the field itself but no nesting. Defaults to `None` (no limit).
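
To make the depth counting concrete, here is one illustrative reading of the rule above, assuming the max_depth parameter added in this PR; the exact boundary behaviour is inferred from the description, not copied from the test suite.

from rest_framework import serializers

field = serializers.ListField(max_depth=1)

field.run_validation([1, 2, [3]])  # one level of nesting inside the list: should be accepted
field.run_validation([[[3]]])      # two levels of nesting: expected to raise ValidationError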
Contributor

Imo we should also add a warning here about the DoS risk when no max_depth is provided.

A field class that validates a list of objects.

**Signature**: `ListField(child=<A_FIELD_INSTANCE>, allow_empty=True, min_length=None, max_length=None)`
**Signature**: `ListField(child=<A_FIELD_INSTANCE>, allow_empty=True, min_length=None, max_length=None, max_depth=None)`
Contributor

Do you think adding a default limit for max_depth would be a good idea? Users could still override it if needed, but having a cushion would help even if they don't set a limit. What’s your take on that?

Contributor Author

I considered that, but a default limit might break existing setups that legitimately use deep nesting. Keeping it None by default feels safer for backward compatibility. That said, we should definitely highlight it in the docs as a security best practice.

@auvipy self-requested a review on December 31, 2025 06:19
Copilot AI left a comment

Pull request overview

This PR introduces a security hardening feature to mitigate recursion-based Denial of Service (DoS) attacks by adding a max_depth parameter to ListField, DictField, and serializer classes. The implementation provides dual-layer protection: validating both the field schema depth (nested field definitions) and the raw input data depth to prevent resource exhaustion from deeply nested JSON payloads.

Key Changes

  • Added max_depth parameter to ListField and DictField with depth tracking and propagation mechanisms
  • Extended BaseSerializer and ListSerializer to support max_depth and cross-boundary depth propagation
  • Added comprehensive test coverage for various depth scenarios including nested serializers, mixed containers, and edge cases

Reviewed changes

Copilot reviewed 5 out of 5 changed files in this pull request and generated 5 comments.

Summary per file:

  • tests/test_fields.py: Added 330 lines of comprehensive test coverage for max_depth validation across ListField, DictField, serializers, and various nesting scenarios
  • rest_framework/serializers.py: Modified BaseSerializer and ListSerializer to support the max_depth parameter, depth tracking, propagation, and data validation logic
  • rest_framework/fields.py: Added the max_depth parameter and validation logic to ListField and DictField, with depth tracking and propagation mechanisms
  • rest_framework/locale/en/LC_MESSAGES/django.po: Added error message translations for depth limit violations in ListField and DictField, plus minor whitespace cleanup
  • docs/api-guide/fields.md: Updated API documentation for ListField and DictField to document the new max_depth parameter with clear usage examples


Comment on lines +1695 to +1702
def _check_data_depth(self, data, current_level):
    items = data.values() if isinstance(data, dict) else data
    for item in items:
        if isinstance(item, (list, tuple, dict)):
            next_level = current_level + 1
            if next_level > self._root_max_depth:
                self.fail('max_depth', max_depth=self._root_max_depth)
            self._check_data_depth(item, next_level)
Copilot AI Jan 3, 2026

There is significant code duplication in the _check_data_depth method implementations across BaseSerializer (lines 145-169), ListSerializer (lines 710-734), ListField (lines 1695-1702), and DictField (lines 1799-1806). These implementations all perform similar recursive depth checking with the same validation logic, but use different error handling approaches (some use self.fail(), others construct ValidationError directly).

This duplication makes the code harder to maintain and increases the risk of inconsistencies. Consider extracting this logic into a shared utility function or mixin method that can be reused across all these classes.
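
For reference, the kind of shared helper being suggested might look roughly like the following; the function name and its error hook are hypothetical, and, as the author's reply below notes, Fields report errors via self.fail() while Serializers raise ValidationError directly, so a shared version would need a pluggable error path.

from rest_framework.exceptions import ValidationError

def check_data_depth(data, max_depth, current_level=0, raise_error=None):
    # Recursively reject containers nested deeper than max_depth.
    # `raise_error`, if provided, lets a Field plug in self.fail(...);
    # otherwise a plain ValidationError is raised (serializer style).
    items = data.values() if isinstance(data, dict) else data
    for item in items:
        if isinstance(item, (list, tuple, dict)):
            next_level = current_level + 1
            if next_level > max_depth:
                if raise_error is not None:
                    raise_error(max_depth)
                raise ValidationError(f'Depth limit of {max_depth} exceeded.')
            check_data_depth(item, max_depth, next_level, raise_error)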

Contributor Author

The duplication is intentional. Fields use self.fail() for error handling, while Serializers construct ValidationError directly; they follow different API patterns.

Extracting this would need conditionals for different error handling, reducing clarity for 8 lines of straightforward logic. The architectural separation between Fields and Serializers is worth maintaining here.

mahdirahimi1999 and others added 2 commits January 5, 2026 20:23
Add condition check to prevent overriding explicit max_depth values.
Includes 8 new test cases.
@auvipy requested a review from peterthomassen on January 6, 2026 12:46