@llbbl llbbl commented Sep 3, 2025

Set up Python Testing Infrastructure

Summary

This PR establishes a comprehensive testing infrastructure for the dialogue-engine Python project. The setup provides developers with a ready-to-use testing environment that follows modern Python testing best practices.

Changes Made

Package Management

  • Poetry Setup: Created pyproject.toml with Poetry configuration as the primary package manager
  • Dependency Migration: Analyzed existing requirements.txt files and migrated core dependencies
  • Development Dependencies: Added essential testing packages (pytest, pytest-cov, pytest-mock, toml)

Testing Configuration

  • pytest Configuration: Comprehensive pytest settings in pyproject.toml with:

    • Test discovery patterns for files, classes, and functions
    • Custom markers: @pytest.mark.unit, @pytest.mark.integration, @pytest.mark.slow
    • Strict checking so unknown markers or misspelled config options fail fast
    • Warning filters to keep noisy deprecation output out of test runs
  • Coverage Settings: Coverage reporting configured with:

    • 80% coverage threshold requirement
    • HTML reports (htmlcov/) and XML output (coverage.xml)
    • Source directories: dialogue-engine/src/programy and nlu
    • Intelligent exclusions for test files, migrations, etc.
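As a rough sketch of what the pytest and coverage sections described above might look like in pyproject.toml (the exact keys and values in this PR may differ; treat the numbers and paths below as illustrative):

```toml
# Illustrative sketch only -- not the literal configuration from this PR.
[tool.pytest.ini_options]
testpaths = ["tests"]
python_files = ["test_*.py"]
python_classes = ["Test*"]
python_functions = ["test_*"]
markers = [
    "unit: fast, isolated unit tests",
    "integration: tests spanning multiple components",
    "slow: long-running tests",
]
addopts = "--strict-markers --strict-config"

[tool.coverage.run]
source = ["dialogue-engine/src/programy", "nlu"]
omit = ["*/tests/*", "*/migrations/*"]

[tool.coverage.report]
fail_under = 80

[tool.coverage.html]
directory = "htmlcov"

[tool.coverage.xml]
output = "coverage.xml"
```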

Directory Structure

tests/
├── __init__.py
├── conftest.py              # Shared fixtures and test configuration
├── test_infrastructure.py   # Infrastructure validation tests
├── unit/
│   ├── __init__.py
│   └── test_sample_unit.py      # Sample unit tests
└── integration/
    ├── __init__.py
    └── test_sample_integration.py  # Sample integration tests
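A sample unit test in the style of tests/unit/test_sample_unit.py might look like the following. The function under test here is invented purely for illustration; the real sample tests in this PR exercise different code:

```python
# Hypothetical unit test; normalize_whitespace is an invented example helper,
# not part of the dialogue-engine codebase.
import pytest


def normalize_whitespace(text: str) -> str:
    """Collapse runs of whitespace into single spaces (illustrative helper)."""
    return " ".join(text.split())


@pytest.mark.unit
def test_normalize_whitespace_collapses_runs():
    assert normalize_whitespace("hello   world\n") == "hello world"


@pytest.mark.unit
def test_normalize_whitespace_empty_string():
    assert normalize_whitespace("") == ""
```

The `@pytest.mark.unit` marker lets `poetry run pytest -m unit` select these tests; with strict marker checking enabled, a typo in the marker name is an error rather than a silent no-op.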

Shared Test Fixtures

The conftest.py provides comprehensive fixtures for:

  • File System: temp_dir, temp_file, mock_file_system
  • Configuration: sample_config, sample_config_file
  • Mocking: mock_client, mock_context, mock_bot, mock_brain
  • External Services: mock_redis, mock_mongodb, mock_sql, mock_requests
  • AIML Testing: sample_aiml_content, sample_aiml_file
  • Environment: environment_variables, setup_test_environment
  • Utilities: capture_logs, performance_timer
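For a sense of the fixture style, a conftest.py entry might be sketched as below. The fixture names mirror the list above, but the bodies are invented for this example and the real config shape may differ:

```python
# Illustrative conftest.py-style fixtures; the config dict shape is hypothetical.
import pytest


def make_sample_config() -> dict:
    # Hypothetical minimal configuration; the real sample_config fixture
    # in this PR may return a different structure.
    return {
        "console": {"bot": "bot"},
        "bot": {"brain": "brain"},
    }


@pytest.fixture
def sample_config():
    """Return a throwaway configuration dict for tests."""
    return make_sample_config()


@pytest.fixture
def temp_file(tmp_path):
    """Create an empty temp file inside pytest's built-in tmp_path."""
    path = tmp_path / "sample.txt"
    path.write_text("")
    return path
```

Tests request these fixtures by name as parameters (e.g. `def test_loads(sample_config): ...`) and pytest handles creation and teardown.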

Validation Tests

  • Infrastructure Tests: Verify pytest, coverage, directory structure, and required files
  • Fixture Tests: Validate all shared fixtures work correctly
  • Configuration Tests: Ensure pytest and coverage configurations are valid
  • Sample Tests: Unit and integration test examples with proper markers

Additional Setup

  • Updated .gitignore: Added testing-related entries while preserving lock files
  • Environment Setup: Automatic Python path configuration for test execution
  • Cross-Platform Support: Compatible with different operating systems

Running Tests

Basic Commands

# Run all tests
poetry run pytest

# Run with verbose output
poetry run pytest -v

# Run specific test categories
poetry run pytest -m unit          # Unit tests only
poetry run pytest -m integration   # Integration tests only
poetry run pytest -m "not slow"    # Skip slow tests

# Run tests in specific directories
poetry run pytest tests/unit/
poetry run pytest tests/integration/

# Generate coverage reports
poetry run pytest --cov-report=html  # HTML coverage report

Coverage Reports

  • HTML Report: Open htmlcov/index.html in browser for detailed coverage
  • XML Report: coverage.xml for CI/CD integration
  • Terminal Output: Coverage summary displayed after test runs

Installation

# Install all dependencies
poetry install

# Install only main and dev dependencies
poetry install --only=main,dev

Validation Results

All infrastructure validation tests pass:

  • ✅ 15/15 infrastructure validation tests passed
  • ✅ 9/9 sample unit and integration tests passed
  • ✅ pytest configuration validated
  • ✅ Coverage configuration validated
  • ✅ All fixtures working correctly

Next Steps

Developers can now:

  1. Write Unit Tests: Add tests in tests/unit/ following the sample patterns
  2. Write Integration Tests: Add tests in tests/integration/ using provided fixtures
  3. Use Shared Fixtures: Leverage the comprehensive fixture library in conftest.py
  4. Monitor Coverage: Aim for 80%+ coverage using the built-in reporting
  5. Run Tests in CI: Use poetry run pytest in continuous integration pipelines
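For step 5, a CI job could be as simple as the following GitHub Actions sketch. The workflow name, action versions, and Python version are assumptions, not part of this PR:

```yaml
# Hypothetical GitHub Actions workflow; adjust versions to match the project.
name: tests
on: [push, pull_request]
jobs:
  pytest:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install poetry
      - run: poetry install
      - run: poetry run pytest
```

The XML coverage report (coverage.xml) produced by the configured settings can then be uploaded to a coverage service from the same job.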

Dependencies Added

Main Dependencies:

  • PyYAML: YAML configuration handling
  • requests: HTTP client library

Development Dependencies:

  • pytest ^7.4.0: Modern testing framework
  • pytest-cov ^4.1.0: Coverage reporting
  • pytest-mock ^3.11.1: Mocking utilities
  • toml ^0.10.2: TOML file parsing
  • pylint ^2.17.0: Code linting
  • coverage ^7.2.0: Coverage measurement
  • packaging ^23.0: Package utilities

Notes

  • The infrastructure is designed to be non-intrusive and can coexist with existing test files
  • Coverage threshold is set to 80% but can be adjusted in pyproject.toml
  • Additional dependencies (like NLU-specific packages) are commented out and can be added as needed
  • Lock files (poetry.lock) are intentionally kept and should be committed

- Add Poetry package management with pyproject.toml configuration
- Configure pytest with coverage reporting (HTML/XML), custom markers (unit/integration/slow)
- Create organized test directory structure (tests/unit/, tests/integration/)
- Add comprehensive conftest.py with shared fixtures for testing
- Include validation tests to verify infrastructure setup
- Update .gitignore with testing-related entries
- Set up 80% coverage threshold with proper exclusions
- Add development dependencies: pytest, pytest-cov, pytest-mock, toml

The testing infrastructure is now ready for developers to write tests immediately.
All validation tests pass and coverage reporting is properly configured.