
Story 10 (Capstone): Create comprehensive enforcement demo #221

@mikelane

Description
Parent Epic

Part of #211 - Full Google Test Size Constraint Enforcement

Summary

This is the FINAL story in the epic. Create a comprehensive demo project that showcases all enforcement features. The demo serves as both a learning resource and a validation that all features work together correctly.

Why a Demo?

  1. Validation - Proves all features work end-to-end
  2. Documentation - Concrete examples are better than abstract docs
  3. Marketing - Compelling demo drives adoption
  4. Onboarding - New users can copy patterns from the demo
  5. Regression testing - Demo serves as integration test suite

Acceptance Criteria

Demo Project Structure

  • Create examples/enforcement-demo/ directory
  • Include pyproject.toml with pytest-test-categories dependency
  • Include comprehensive README.md
  • Include demo script (demo.sh) for running showcase

Demo Test Suite

The demo must include tests that demonstrate:

  • Filesystem violations

    • Test that reads a file (violation)
    • Test that uses tmp_path (passes)
    • Test that uses pyfakefs (passes)
  • Network violations

    • Test that opens socket (violation)
    • Test that uses pytest-httpx mock (passes)
    • Medium test that uses localhost (passes)
    • Large test with external network (passes)
  • Sleep violations

    • Test that calls time.sleep (violation)
    • Test that uses FakeTimer (passes)
    • Test that uses freezegun (passes)
  • Threading violations

    • Test that spawns thread (violation)
    • Test that runs synchronously (passes)
    • Medium test with threads (passes)
  • Subprocess violations

    • Test that calls subprocess.run (violation)
    • Test that mocks subprocess (passes)
    • Medium test with subprocess (passes)
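The sleep pattern above can be sketched with the stdlib alone. This is a minimal sketch: `wait_and_retry` is a hypothetical demo-app helper, and `unittest.mock` stands in for the `FakeTimer` fixture mentioned above (the real demo tests would also carry `@pytest.mark.small` markers):

```python
"""Sketch of the sleep solution pattern, stdlib only."""
import time
from unittest import mock


def wait_and_retry(delay: float) -> str:
    """Demo-app helper that blocks before returning (hypothetical)."""
    time.sleep(delay)
    return "done"


def test_sleep_solution() -> None:
    """Patching time.sleep keeps the small test hermetic and instant."""
    with mock.patch("time.sleep") as fake_sleep:
        # The helper returns immediately because sleep is a mock.
        assert wait_and_retry(5.0) == "done"
    fake_sleep.assert_called_once_with(5.0)
```

The same patch-the-blocking-call shape applies to the `freezegun` variant, which replaces the clock rather than the sleep call.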

Configuration Examples

  • pyproject.toml with all enforcement options
  • pytest.ini alternative
  • Environment variable examples
  • CI/CD pipeline example (GitHub Actions)

Demo Script

Create demo.sh that:

  • Runs tests in strict mode (shows violations)
  • Runs tests in warn mode (shows warnings)
  • Runs tests in off mode (all pass)
  • Runs with selective constraints
  • Shows before/after migration example

Documentation

  • README.md with:
    • Quick start instructions
    • Feature overview
    • Configuration examples
    • Troubleshooting guide
  • MIGRATION.md showing gradual adoption patterns
  • Screenshots/recordings of terminal output

Technical Design

Project Structure

examples/enforcement-demo/
├── README.md
├── MIGRATION.md
├── pyproject.toml
├── pytest.ini
├── demo.sh
├── .github/
│   └── workflows/
│       └── test.yml
├── src/
│   └── demo_app/
│       ├── __init__.py
│       ├── config.py      # File loading (for filesystem demo)
│       ├── api_client.py  # HTTP client (for network demo)
│       ├── cache.py       # Uses time (for sleep demo)
│       ├── worker.py      # Uses threading (for threading demo)
│       └── shell.py       # Uses subprocess (for subprocess demo)
└── tests/
    ├── conftest.py
    ├── test_filesystem_violations.py
    ├── test_filesystem_solutions.py
    ├── test_network_violations.py
    ├── test_network_solutions.py
    ├── test_sleep_violations.py
    ├── test_sleep_solutions.py
    ├── test_threading_violations.py
    ├── test_threading_solutions.py
    ├── test_subprocess_violations.py
    ├── test_subprocess_solutions.py
    └── test_mixed_sizes.py

Sample pyproject.toml

[project]
name = "enforcement-demo"
version = "1.0.0"
description = "Demo project for pytest-test-categories enforcement features"
requires-python = ">=3.11"
dependencies = []

[project.optional-dependencies]
dev = [
    "pytest>=8.0",
    "pytest-test-categories>=1.3.0",
    "pytest-httpx",
    "pyfakefs",
    "freezegun",
    "responses",
]

[tool.pytest.ini_options]
testpaths = ["tests"]
markers = [
    "small: Fast unit tests without external dependencies",
    "medium: Integration tests with local services",
    "large: Full system tests with external services",
]

[tool.pytest-test-categories.enforcement]
mode = "strict"
filesystem = true
network = true
sleep = true
threading = true
subprocess = true
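The acceptance criteria also call for a pytest.ini alternative. A minimal sketch covering the standard pytest options is below; note the enforcement table above is TOML-specific, so in this sketch it remains in pyproject.toml (actual ini-option support for enforcement settings may differ):

```ini
# pytest.ini - alternative to the [tool.pytest.ini_options] table above.
[pytest]
testpaths = tests
markers =
    small: Fast unit tests without external dependencies
    medium: Integration tests with local services
    large: Full system tests with external services
```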

Sample Violation Test

# tests/test_filesystem_violations.py
"""
Filesystem violation examples.

These tests demonstrate what happens when small tests attempt
filesystem access. Run with different modes to see behavior:

  pytest --tc-strict  # Tests fail with FilesystemAccessViolationError
  pytest --tc-warn    # Tests pass with warnings
  pytest --tc-off     # Tests pass silently
"""

import pytest


class DescribeFilesystemViolations:
    """Tests that violate filesystem hermeticity."""

    @pytest.mark.small
    def it_fails_when_reading_files(self):
        """Small test cannot read from the real filesystem."""
        # This violates hermeticity - reading an arbitrary system file
        with open('/etc/hosts') as f:
            content = f.read()

        assert 'localhost' in content

    @pytest.mark.small
    def it_fails_when_writing_files(self):
        """Small test cannot write to filesystem."""
        # This violates hermeticity - writing to CWD
        with open('output.txt', 'w') as f:
            f.write('test data')

    @pytest.mark.small
    def it_fails_with_pathlib(self):
        """Small test cannot use pathlib for real paths."""
        from pathlib import Path
        
        # This violates hermeticity
        config = Path('/etc/app/config.yaml')
        if config.exists():
            content = config.read_text()

Sample Solution Test

# tests/test_filesystem_solutions.py
"""
Filesystem solutions - how to write hermetic tests.

These tests demonstrate proper patterns for tests that need
file operations without violating hermeticity.
"""

import pytest
from pathlib import Path


class DescribeFilesystemSolutions:
    """Hermetic alternatives to filesystem access."""

    @pytest.mark.small
    def it_uses_tmp_path_fixture(self, tmp_path: Path):
        """Use pytest's tmp_path for isolated file operations."""
        # tmp_path is explicitly allowed - isolated per test
        config_file = tmp_path / 'config.yaml'
        config_file.write_text('key: value')
        
        content = config_file.read_text()
        assert content == 'key: value'

    @pytest.mark.small
    def it_uses_pyfakefs(self, fs):
        """Use pyfakefs for complete filesystem virtualization."""
        # fs fixture creates virtual filesystem
        fs.create_file('/etc/app/config.yaml', contents='key: value')
        
        # This reads from virtual filesystem, not real
        with open('/etc/app/config.yaml') as f:
            content = f.read()
        
        assert content == 'key: value'

    @pytest.mark.small
    def it_mocks_file_operations(self, mocker):
        """Mock file operations directly."""
        mock_open = mocker.patch(
            'builtins.open',
            mocker.mock_open(read_data='key: value')
        )
        
        with open('/any/path') as f:
            content = f.read()
        
        assert content == 'key: value'
        mock_open.assert_called_once()

    @pytest.mark.medium
    def it_allows_filesystem_for_medium_tests(self, tmp_path: Path):
        """Medium tests may access real filesystem."""
        # Medium tests can do filesystem I/O
        real_file = tmp_path / 'real.txt'
        real_file.write_text('real data')
        
        assert real_file.read_text() == 'real data'
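The subprocess solutions file can follow the same shape as the filesystem one. A minimal sketch, assuming a hypothetical `git_version` helper in the demo app and using stdlib `unittest.mock` in place of the `mocker` fixture:

```python
"""Sketch of a subprocess solution test, stdlib only."""
import subprocess
from unittest import mock


def git_version() -> str:
    """Demo-app helper that shells out (hypothetical)."""
    result = subprocess.run(
        ["git", "--version"], capture_output=True, text=True, check=True
    )
    return result.stdout.strip()


def small_test_mocks_subprocess() -> None:
    """Patching subprocess.run keeps the small test hermetic."""
    fake = subprocess.CompletedProcess(
        args=["git", "--version"], returncode=0,
        stdout="git version 2.43.0\n",
    )
    with mock.patch("subprocess.run", return_value=fake) as run:
        # No real process is spawned; the canned result is returned.
        assert git_version() == "git version 2.43.0"
    run.assert_called_once()
```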

Demo Script

#!/bin/bash
# demo.sh - Demonstrate pytest-test-categories enforcement

set -e

echo "=========================================="
echo "pytest-test-categories Enforcement Demo"
echo "=========================================="
echo

# Install dependencies
echo "Installing dependencies..."
uv sync --all-groups
echo

# Demo 1: Strict mode (violations fail)
echo "=========================================="
echo "Demo 1: STRICT MODE"
echo "Violations cause test failures"
echo "=========================================="
echo
uv run pytest tests/test_filesystem_violations.py --tc-strict -v || true
echo

# Demo 2: Warn mode (violations logged)
echo "=========================================="
echo "Demo 2: WARN MODE"
echo "Violations logged but tests pass"
echo "=========================================="
echo
uv run pytest tests/test_filesystem_violations.py --tc-warn -v
echo

# Demo 3: Solutions pass in strict mode
echo "=========================================="
echo "Demo 3: SOLUTIONS PASS"
echo "Properly hermetic tests pass in strict mode"
echo "=========================================="
echo
uv run pytest tests/test_filesystem_solutions.py --tc-strict -v
echo

# Demo 4: Full test suite with all sizes
echo "=========================================="
echo "Demo 4: MIXED TEST SIZES"
echo "Small, medium, and large tests together"
echo "=========================================="
echo
uv run pytest tests/ --tc-strict -v --ignore-glob='tests/test_*_violations.py'
echo

echo "=========================================="
echo "Demo complete!"
echo "=========================================="

GitHub Actions CI Example

# .github/workflows/test.yml
name: Test

on: [push, pull_request]

jobs:
  test-strict:
    name: Test (Strict Enforcement)
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v3
      - run: uv sync --all-groups
      - run: uv run pytest --tc-strict

  test-warn:
    name: Test (Warn Mode - Migration)
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v3
      - run: uv sync --all-groups
      - run: uv run pytest --tc-warn

Dependencies

All prior stories in epic #211. This is the capstone story and should land last.

Definition of Done

  • Demo project structure created
  • All violation examples implemented (5 types)
  • All solution examples implemented (5 types)
  • pyproject.toml with full configuration
  • pytest.ini alternative provided
  • demo.sh script works end-to-end
  • GitHub Actions workflow example
  • README.md with comprehensive documentation
  • MIGRATION.md with adoption patterns
  • All tests pass in appropriate modes
  • Demo is self-contained and runnable
  • PR merged to main

Estimation

Effort: Medium-Large (2-3 days)

  • Many example tests to write
  • Documentation takes time
  • CI/CD examples need testing

Labels

enhancement, documentation, priority-medium
