
Optimize scenario tests: event vs vector comparison tests #388

@MDUYN

Description


Problem

The event-vs-vector comparison scenario tests (test_event_vs_vector_backtest.py and test_vector_vs_event_backtest_results.py) are too slow for CI: they cause Ubuntu runners to time out. Each test runs both an event-based and a vector-based backtest and compares the results, which is inherently expensive because of the event backtest engine. They are currently skipped via @unittest.skip.

Required changes

  1. Reduce execution time — These tests run both event and vector backtests over the same date range. The event backtest is the bottleneck (multiple DB round-trips per time step). Use shorter date ranges (e.g. 30–60 days), coarser time frames, or shorter EMA periods to reduce iterations while still generating enough trades for meaningful comparison.

  2. Use only offline test data from tests/resources/test_data/ — All data sources must reference CSV files located in tests/resources/test_data/ (or its subdirectories). Tests should not depend on data files scattered across tests/resources/data/ or tests/resources/market_data_sources_for_testing/. This ensures tests are self-contained, reproducible, and don't break when external data files are reorganized.
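Both required changes could be sketched roughly as follows. This is a hypothetical setup helper, not the framework's actual API: the directory layout, the CSV helper, and the 45-day window are illustrative assumptions.

```python
# Hypothetical test setup sketch. The helper names and the 45-day window
# are illustrative assumptions, not the framework's actual API.
from datetime import datetime, timedelta
from pathlib import Path

# Resolve the offline data directory relative to the test file itself, so
# the tests stay self-contained and independent of the CI working directory.
TEST_DATA_DIR = Path(__file__).resolve().parent.parent / "resources" / "test_data"

# A short window (e.g. 45 days) keeps the event engine's per-time-step DB
# round-trips low while still generating enough trades to compare results.
END_DATE = datetime(2023, 12, 31)
START_DATE = END_DATE - timedelta(days=45)

def data_source_path(filename: str) -> str:
    """Return the absolute path of an offline CSV, failing fast if missing."""
    path = TEST_DATA_DIR / filename
    if not path.exists():
        raise FileNotFoundError(f"Offline test data not found: {path}")
    return str(path)
```

Failing fast on a missing file makes it obvious when a test still points at a CSV outside tests/resources/test_data/.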

Affected files

  • tests/scenarios/test_event_vs_vector_backtest.py
  • tests/scenarios/test_vector_vs_event_backtest_results.py

Acceptance criteria

  • Each test completes its combined event + vector run in under 60 seconds
  • All tests use data exclusively from tests/resources/test_data/
  • Remove the @unittest.skip decorator once optimized
  • Tests pass on both macOS and Ubuntu CI runners
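One way to enforce the 60-second budget in the tests themselves is a small timing guard; this is an illustrative pattern, and the stand-in workload (time.sleep) would be replaced by the real event + vector backtest calls:

```python
# Illustrative timing-guard pattern for the <60s acceptance criterion.
# The workload below is a stand-in; the real tests would run the event
# and vector backtests inside run_with_budget().
import time
import unittest

class TimeBudgetedBacktestTest(unittest.TestCase):
    TIME_BUDGET_SECONDS = 60

    def run_with_budget(self, fn):
        # Time the workload with a monotonic clock and fail with an
        # actionable message if it exceeds the CI budget.
        start = time.monotonic()
        result = fn()
        elapsed = time.monotonic() - start
        self.assertLess(
            elapsed, self.TIME_BUDGET_SECONDS,
            f"Backtest exceeded the {self.TIME_BUDGET_SECONDS}s CI budget "
            f"({elapsed:.1f}s); shrink the date range or coarsen the time frame.",
        )
        return result

    def test_event_vs_vector_within_budget(self):
        # Stand-in workload; replace with the combined event + vector run.
        result = self.run_with_budget(lambda: time.sleep(0.01) or "done")
        self.assertEqual(result, "done")
```

A failed timing assertion then points directly at the fix (shorter date range or coarser time frame) instead of surfacing as an opaque CI timeout.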

Metadata

Labels

enhancement (New feature or request)