
Commit e0c9bd8 — Release v4.4.48

2 parents d14efaf + 51f187f

32 files changed: +1076 −662 lines

AGENTS.md

Lines changed: 1 addition & 0 deletions
@@ -34,6 +34,7 @@ This repo is frequently edited by **multiple AI sessions**. To avoid lost work:
- **Test gating (STRICT):** do not introduce new environment variables just to skip/disable tests or to paper over CI failures.
  - Prefer existing pytest markers (`apitest`, `acceptance_backtest`, etc.) and normal test skips with clear reasons.
  - If a new env var is truly required for a user-facing feature, document it in `docsrc/environment_variables.rst` in the same PR.
- **Full suite verification:** prefer pushing commits to GitHub on the shared `version/X.Y.Z` branch so sharded CI validates the full suite. Local runs should focus on targeted tests or marker-filtered subsets.

1. **Never launch ThetaTerminal locally WITH PRODUCTION CREDENTIALS.** Production has the only licensed session for that account. Starting the jar with prod credentials (even briefly or via Docker) instantly terminates the prod connection and halts all customers.
2. **Use the downloader for backtests.** All tests/backtests must set `DATADOWNLOADER_BASE_URL` and `DATADOWNLOADER_API_KEY` via the runtime environment. Do not short-cut by hitting Theta directly.

CHANGELOG.md

Lines changed: 14 additions & 0 deletions
@@ -1,5 +1,19 @@
# Changelog

## 4.4.48 - 2026-02-10

### Added
- Backtesting artifacts: emit Parquet siblings for `*_indicators.csv`, `*_trades.csv`, `*_stats.csv`, and `*_trade_events.csv` (zstd + PyArrow). CSV remains the compatibility layer.

### Changed
- Tradier: support OAuth payload + access token refresh; add runtime notes for the refresh flow.
- Tests: mark DataBento backtest coverage as `apitest` so the default CI suite stays deterministic without vendor credentials.
- Docs: clarify auto-expiry futures behavior and IBKR crypto roots.

### Fixed
- Data: handle `timeshift=None` in Data bars.
- Futures (auto-expiry): make selection roll-aware and harden IBKR conid negative cache behavior.

## 4.4.47 - 2026-02-07
### Added
- Backtesting: support `BACKTESTING_BUDGET` environment override for strategy backtest cash/budget.

docs/BACKTESTING_TESTS.md

Lines changed: 11 additions & 0 deletions
@@ -7,6 +7,17 @@ This project relies on a layered test strategy:
3. **Acceptance backtests** (manual, end-to-end): run from `Strategy Library/` and inspect artifacts
   (`*_trades.html`, `*_tearsheet.html`, `*_stats.csv`) for realism and regressions.

## Recommended workflow (local + GitHub CI)

Local runs are great for fast feedback, but the full suite is often faster and more reliable on GitHub because CI runs tests in parallel (sharded).

Recommended approach on `version/X.Y.Z` branches:

- Run targeted tests locally for quick iteration.
- Push early/often so GitHub CI can run the full sharded suite for release confidence.

Treat **green GitHub CI** as the “release-ready” signal since it matches the release workflow environment more closely than a single local machine.

## Test authority (“Legacy tests win”)

When tests fail, **how you fix them depends on how old the test is**:

docs/DEPLOYMENT.md

Lines changed: 6 additions & 0 deletions
@@ -86,6 +86,12 @@ Publishing is **tag-driven** via `.github/workflows/release.yml`.
- Confirm there are no: `*.env`, `*.log`, `dist/`, `tmp/`, large stray binaries, or accidental artifacts.
- Manual code review (security, best-effort):
  - Scan the diff for unexpected behavior: new process execution, credential handling, network calls, filesystem writes, or workflow changes.
  - Explicitly look for “malicious” indicators:
    - obfuscated code (large base64 blobs, weird string concatenation around URLs/commands)
    - `eval`/`exec`, unsafe deserialization (`pickle.loads`) in new code
    - new network destinations / hard-coded private endpoints
    - silent secret capture/exfil paths (reading `.env`, keychains, `~/.ssh`, AWS creds)
    - changes under `.github/workflows/` (must be intentional and reviewed)
  - If new/renamed modules were added, ensure they’re “boring” (no hidden side effects at import time).
  - If any new binary is added, confirm it’s expected and justified (size + provenance).
- If anything feels off, stop and escalate before merging/releasing.

docs/IBKR_FUTURES_BACKTESTING.md

Lines changed: 16 additions & 0 deletions
@@ -77,6 +77,22 @@ IBKR futures bar history requires a specific contract identifier (conid). LumiBo
For deterministic acceptance and correct lookup behavior, **explicit futures** should be used:
- `Asset("MES", asset_type="future", expiration=date(2025, 12, 19))`

### Auto-expiry futures (rolling wrapper; recommended when you don't want to hardcode an expiration)

If you want “front month” behavior without hardcoding an expiration date, use an auto-expiry futures asset:
- `Asset("MES", asset_type="future", auto_expiry=Asset.AutoExpiry.FRONT_MONTH)`

Semantics:
- This is a **rolling wrapper** resolved by the data source/broker as time advances (similar to `cont_future` stitching).
- `Asset.expiration` remains `None` unless you explicitly provide `expiration=...`.
- This keeps backtests and live behavior aligned and avoids `date.today()` skew when backtesting historical periods.
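To make the rolling semantics concrete, here is a hedged sketch of how a front-month resolver could work for a CME equity index product like MES. The `front_month_expiration` helper is hypothetical (not LumiBot API) and assumes the standard quarterly Mar/Jun/Sep/Dec cycle with third-Friday expiration:

```python
from datetime import date, timedelta

def front_month_expiration(today: date) -> date:
    """Hypothetical helper: next quarterly (Mar/Jun/Sep/Dec) third-Friday
    expiration on or after `today`, mirroring how a FRONT_MONTH wrapper
    might resolve MES as backtest time advances."""
    for year in (today.year, today.year + 1):
        for month in (3, 6, 9, 12):
            first = date(year, month, 1)
            # weekday(): Monday == 0 ... Friday == 4
            first_friday = first + timedelta(days=(4 - first.weekday()) % 7)
            third_friday = first_friday + timedelta(days=14)
            if third_friday >= today:
                return third_friday
    raise ValueError("no quarterly expiration found")

print(front_month_expiration(date(2025, 11, 1)))  # 2025-12-19, matching the explicit example above
```

Note the key difference from the explicit form: the wrapper re-evaluates this as the backtest clock moves, so a multi-month backtest rolls through several contracts instead of pinning one expiration.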

### Crypto futures roots (IBKR / CME)

IBKR’s futures root symbols can differ from spot tickers, especially for CME crypto products. Examples:
- Bitcoin: `MBT` (Micro Bitcoin futures), `BRR` (Bitcoin Reference Rate futures root)
- Ether: `MET` (Micro Ether futures), `ETHUSDRR` (Ether Reference Rate futures root)

### Expired contracts (critical)

IBKR Client Portal cannot reliably discover conids for **expired** futures. For backtests that reference expired
Lines changed: 66 additions & 0 deletions
@@ -0,0 +1,66 @@
# PARQUET BACKTEST ARTIFACTS (BOTSPOT)

> Emit Parquet siblings for backtest result artifacts (indicators/trades/stats/trade_events) to speed BotSpot analysis and UI queries.

**Last Updated:** 2026-02-10
**Status:** Draft
**Audience:** Developers + AI Agents

---

## Overview

BotSpot frequently queries `*_indicators.csv` (and related artifacts) via DuckDB inside `botspot_node`. CSV parsing + repeated scans can be slow (10s–40s per query in worst cases). This change makes LumiBot emit Parquet versions of the same artifacts so downstream services can prefer Parquet (faster scans, typed columns, compression) while keeping CSV as the compatibility layer.

This is intentionally additive:

- **CSV stays the source-of-compatibility** (existing consumers keep working).
- **Parquet becomes the source-of-speed** for analytics (DuckDB, UI endpoints, agent tools).

## What Changed

LumiBot now attempts to write these Parquet artifacts alongside existing outputs:

- `*_indicators.parquet` (sibling of `*_indicators.csv`)
- `*_trades.parquet` (sibling of `*_trades.csv`)
- `*_stats.parquet` (sibling of `*_stats.csv`)
- `*_trade_events.parquet` (sibling of `*_trade_events.csv`)

Implementation notes:

- Uses `pandas.DataFrame.to_parquet(engine="pyarrow", compression="zstd")`.
- **Best-effort:** Parquet export failures are logged as warnings and do **not** fail a backtest (CSV remains the fallback).
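As an illustrative sketch of that additive pattern (the helper name here is made up; the real logic lives in the individual artifact writers), the sibling write looks roughly like:

```python
import logging

import pandas as pd

logger = logging.getLogger(__name__)

def write_artifact_with_parquet_sibling(df: pd.DataFrame, csv_path: str) -> None:
    """Sketch of the pattern described above: the CSV is always written,
    and the Parquet sibling is written on a best-effort basis."""
    df.to_csv(csv_path, index=False)
    parquet_path = (
        csv_path[:-4] + ".parquet" if csv_path.lower().endswith(".csv") else csv_path + ".parquet"
    )
    try:
        df.to_parquet(parquet_path, index=False, engine="pyarrow", compression="zstd")
    except Exception as exc:
        # A missing pyarrow/zstd codec must never fail the backtest;
        # the CSV stays authoritative for compatibility.
        logger.warning("Failed to write parquet sibling %s: %s", parquet_path, exc)
```

The `try`/`except` around only the Parquet write is the whole point: a broken optional dependency degrades to warnings, never to a failed run.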
## Why This Helps Performance

Downstream (BotSpot) improvements unlocked by emitting Parquet:

- DuckDB scans Parquet without CSV parsing/inference overhead.
- Parquet is columnar, so projecting a few columns is much cheaper than reading entire CSV rows.
- Enables Parquet-first behavior in `botspot_node` (fallback to CSV when Parquet is missing).

## Verification (Local)

Parquet generation is covered by unit tests:

- Indicators Parquet: `python3 -m pytest tests/test_indicators_detail_text_edge_cases.py -q`
- Trades Parquet: `python3 -m pytest tests/test_indicator_subplots.py::test_plot_returns_preserves_cash_settled_status -q`
- Stats Parquet: `python3 -m pytest tests/test_strategy_dump_stats_regression.py -q`
- Trade events Parquet: `python3 -m pytest tests/test_backtesting_broker.py -q`

Manual sanity check (after running any backtest with `show_indicators=True`):

- Confirm `logs/*_indicators.csv` and `logs/*_indicators.parquet` both exist and have the same row count.

## Rollout Order (BotSpot)

1. Deploy LumiBot (so new backtests produce Parquet artifacts).
2. Deploy Bot Manager upload pipeline (so Parquet artifacts get uploaded with results).
3. Deploy `botspot_node` (Parquet-first querying + timing logs).
4. Deploy `botspot_react` (ensure indicators Parquet remains non-downloadable in UI).

## Risks / Notes

- Parquet typing can differ from DuckDB CSV inference (usually a net improvement). We rely on CSV fallback if anything is missing.
- This feature assumes `pyarrow` is present in the runtime image. (It is already pinned in LumiBot deps; if it is absent, Parquet writes will warn and CSV continues to work.)
Lines changed: 66 additions & 0 deletions
@@ -0,0 +1,66 @@
# Tradier OAuth Payload + Refresh Support (LumiBot)

Date: 2026-02-07
Owner: Codex (implementation + research)
Status: Implemented in `lumibot` (commit `52eef70b`)

## Why this exists

BotSpot deployments can link Tradier via OAuth. Tradier OAuth access tokens expire (per Tradier docs), so a long-running bot needs refresh-token support (when available) to keep trading without requiring users to re-link every day.

## Runtime Inputs (env vars)

This implementation supports both existing “manual token” Tradier usage and OAuth usage.

### Existing (manual token)

- `TRADIER_ACCESS_TOKEN`
- `TRADIER_ACCOUNT_NUMBER`
- `TRADIER_IS_PAPER` (`true`/`false`)

### OAuth mode (BotSpot)

BotSpot injects:

- `TRADIER_TOKEN`
  - base64url JSON payload from the OAuth token exchange
  - expected fields: `access_token`, `expires_in` (optional), `issued_at` (optional), `refresh_token` (optional)
- `TRADIER_REFRESH_TOKEN` (optional)
- `TRADIER_OAUTH_CLIENT_ID` (required to refresh)
- `TRADIER_OAUTH_CLIENT_SECRET` (required to refresh)

Notes:

- Tradier refresh tokens are **partner-only**; if `TRADIER_REFRESH_TOKEN` is not present, the bot can still start, but it may stop working once the access token expires.
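A minimal sketch of decoding such a payload (the helper name is illustrative; LumiBot's actual decoder lives in `lumibot/brokers/tradier.py`), assuming standard base64url encoding with `=` padding possibly stripped:

```python
import base64
import json

def decode_oauth_payload(token: str) -> dict:
    """Decode a base64url-encoded JSON payload like TRADIER_TOKEN."""
    padded = token + "=" * (-len(token) % 4)  # restore any stripped '=' padding
    return json.loads(base64.urlsafe_b64decode(padded.encode("ascii")))

# Round-trip example with a fabricated payload:
raw = {"access_token": "abc123", "expires_in": 86400}
token = base64.urlsafe_b64encode(json.dumps(raw).encode()).decode().rstrip("=")
print(decode_oauth_payload(token)["access_token"])  # abc123
```

Handling stripped padding matters because many OAuth stacks emit unpadded base64url, which `base64.urlsafe_b64decode` rejects without restoration.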
35+
36+
## What changed
37+
38+
File: `lumibot/brokers/tradier.py`
39+
40+
- Added support for decoding `TRADIER_TOKEN` (base64url JSON).
41+
- If `access_token` is missing/blank in config/args, it is sourced from the decoded payload.
42+
- Added best-effort refresh support via:
43+
- proactive refresh near expiry (when expiry metadata exists)
44+
- forced refresh and single retry when an API call fails with `401`
45+
- Refresh uses Tradier endpoint:
46+
- `POST https://api.tradier.com/v1/oauth/refreshtoken`
47+
- Basic Auth: `TRADIER_OAUTH_CLIENT_ID:TRADIER_OAUTH_CLIENT_SECRET`
48+
- Body: `grant_type=refresh_token&refresh_token=<TRADIER_REFRESH_TOKEN>`
49+
- When a refresh succeeds, the broker updates the access token across:
50+
- broker’s `lumiwealth_tradier` client
51+
- data source’s `lumiwealth_tradier` client
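The refresh call above can be sketched with `requests` (built but not sent here, so it is easy to inspect; the helper name is illustrative and the request shape follows the endpoint details listed):

```python
import requests

def build_refresh_request(client_id: str, client_secret: str, refresh_token: str) -> requests.PreparedRequest:
    """Prepare (without sending) the Tradier refresh-token request described above."""
    return requests.Request(
        "POST",
        "https://api.tradier.com/v1/oauth/refreshtoken",
        auth=(client_id, client_secret),  # HTTP Basic: client_id:client_secret
        headers={"Accept": "application/json"},
        data={"grant_type": "refresh_token", "refresh_token": refresh_token},
    ).prepare()

prepared = build_refresh_request("my-client-id", "my-client-secret", "my-refresh-token")
print(prepared.body)  # grant_type=refresh_token&refresh_token=my-refresh-token
```

In the real broker the response's `access_token` would then be propagated to both `lumiwealth_tradier` clients, as noted above.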
## Limitations / Operational Notes

- If Tradier rotates the refresh token on refresh, the new refresh token cannot be persisted back into task env vars.
  - The code logs a warning if refresh-token rotation is detected.
  - If rotation happens, users may need to re-link after the old refresh token becomes invalid.

## Tests

File: `tests/test_tradier.py`

- Added unit tests for:
  - decoding OAuth payload into `access_token`
  - refreshing token when payload is expired (requests mocked; no network)

lumibot/brokers/broker.py

Lines changed: 14 additions & 0 deletions
@@ -2367,6 +2367,20 @@ def export_trade_events_to_csv(self, filename):
    output_df = self._trade_event_log_df.set_index("time")
    output_df.to_csv(filename)

    parquet_filename = (
        filename[:-4] + ".parquet" if str(filename).lower().endswith(".csv") else str(filename) + ".parquet"
    )
    try:
        self._trade_event_log_df.to_parquet(
            parquet_filename,
            index=False,
            engine="pyarrow",
            compression="zstd",
        )
    except Exception as exc:
        # Never fail end-of-run export due to parquet; CSV is the compatibility layer.
        self.logger.warning("Failed to write trade events parquet file %s: %s", parquet_filename, exc)

def set_strategy_name(self, strategy_name):
    """
    Lets the broker know the name of the strategy that is using it for logging purposes.
