feat: optuna #121
Pull Request Overview
This PR integrates Optuna hyperparameter optimization into the quantitative research framework, enabling automated hyperparameter tuning for factor-based trading strategies with support for early stopping via pruning and distributed optimization via RDB storage.
Key changes:
- Added Optuna integration with pruning strategies (MedianPruner, PercentilePruner) for efficient hyperparameter search
- Implemented a new `autotune` CLI command with YAML configuration support for optimizing momentum, volatility, and other factors
- Added dependencies `optuna>=3.0.0` and `pyyaml>=6.0` to support the new functionality
Reviewed Changes
Copilot reviewed 6 out of 6 changed files in this pull request and generated 7 comments.
| File | Description |
|---|---|
| src/quant_research_starter/tuning/optuna_runner.py | Core Optuna integration implementing OptunaRunner class, backtest objective function factory, and hyperparameter suggestion utilities |
| src/quant_research_starter/tuning/__init__.py | Module initialization exporting OptunaRunner |
| src/quant_research_starter/cli.py | Added autotune CLI command with options for data file, factor type, trials, metrics, storage, and pruning configuration |
| pyproject.toml | Added optuna and pyyaml dependencies |
| examples/autotune_config.yaml | Example YAML configuration demonstrating all tuning options |
| README.md | Documentation for hyperparameter tuning feature with usage examples |
Co-authored-by: Copilot <[email protected]>
Description
This PR integrates Optuna hyperparameter optimization with pruning support into the quantitative research framework. The implementation automates hyperparameter search for factor-based strategies, enabling efficient optimization of trading strategy parameters.
Key Changes
- New `tuning/optuna_runner.py` module: core Optuna integration, including the `OptunaRunner` class, a backtest objective function factory, and hyperparameter suggestion utilities
- New `autotune` CLI command: command-line interface for hyperparameter tuning, with options for the data file, factor type, trial count, metric, storage, and pruning configuration
- Example YAML configuration: `examples/autotune_config.yaml`, demonstrating all configuration options
- Dependencies: added `optuna>=3.0.0` and `pyyaml>=6.0` to the project dependencies
- Documentation: updated README with comprehensive autotune documentation and usage examples
Testing
- `pytest` passes locally
- Linting passes (`ruff`, `black`)
Semver Changes
Issues
Closes #95 - autotune: integrate Optuna hyperparameter optimization + pruning
Checklist