
Conversation

@Satvik-Singh192 (Contributor) commented Nov 12, 2025

Description

This PR integrates Optuna hyperparameter optimization, with pruning support, into the quantitative research framework. It automates parameter search for factor-based trading strategies, so strategy parameters can be tuned efficiently instead of by hand.

Key Changes

  • New tuning/optuna_runner.py module: Core Optuna integration with support for:

    • Automated hyperparameter search with pruning
    • Early stopping of bad trials (MedianPruner, PercentilePruner)
    • Optional RDB storage for distributed tuning runs (SQLite, PostgreSQL, MySQL)
    • Customizable objective functions for backtest-based optimization
    • Trial history tracking and result persistence
  • New CLI command autotune: Command-line interface for hyperparameter tuning with:

    • YAML configuration file support
    • Command-line options for all parameters
    • Integration with existing backtest infrastructure
    • Support for optimizing momentum, volatility, and other factors
  • Example YAML configuration: examples/autotune_config.yaml demonstrating all configuration options

  • Dependencies: Added optuna>=3.0.0 and pyyaml>=6.0 to project dependencies

  • Documentation: Updated README with comprehensive autotune documentation and usage examples

Testing

  • pytest passes locally
  • Linting passes (ruff, black)

Semver Changes

  • Patch (bug fix, no new features)
  • Minor (new features, no breaking changes) ← selected (per the Semver:minor label)
  • Major (breaking changes)

Issues

Closes #95 - autotune: integrate Optuna hyperparameter optimization + pruning



netlify bot commented Nov 12, 2025

Deploy Preview for strong-duckanoo-898b2c ready!

🔨 Latest commit: 4a3dda8
🔍 Latest deploy log: https://app.netlify.com/projects/strong-duckanoo-898b2c/deploys/6915fef93bb7d70008b48e69
😎 Deploy Preview: https://deploy-preview-121--strong-duckanoo-898b2c.netlify.app


netlify bot commented Nov 12, 2025

Deploy Preview for qrsopcode canceled.

🔨 Latest commit: 8f2cbe7
🔍 Latest deploy log: https://app.netlify.com/projects/qrsopcode/deploys/6914488dddb7ae0008ef490d


Copilot AI left a comment


Pull Request Overview

This PR integrates Optuna hyperparameter optimization into the quantitative research framework, enabling automated hyperparameter tuning for factor-based trading strategies with support for early stopping via pruning and distributed optimization via RDB storage.

Key changes:

  • Added Optuna integration with pruning strategies (MedianPruner, PercentilePruner) for efficient hyperparameter search
  • Implemented new autotune CLI command with YAML configuration support for optimizing momentum, volatility, and other factors
  • Added dependencies for optuna>=3.0.0 and pyyaml>=6.0 to support the new functionality

Reviewed Changes

Copilot reviewed 6 out of 6 changed files in this pull request and generated 7 comments.

Summary per file:

| File | Description |
| --- | --- |
| `src/quant_research_starter/tuning/optuna_runner.py` | Core Optuna integration implementing the `OptunaRunner` class, a backtest objective-function factory, and hyperparameter suggestion utilities |
| `src/quant_research_starter/tuning/__init__.py` | Module initialization exporting `OptunaRunner` |
| `src/quant_research_starter/cli.py` | Adds the `autotune` CLI command with options for data file, factor type, trials, metrics, storage, and pruning configuration |
| `pyproject.toml` | Adds the `optuna` and `pyyaml` dependencies |
| `examples/autotune_config.yaml` | Example YAML configuration demonstrating all tuning options |
| `README.md` | Documentation for the hyperparameter tuning feature, with usage examples |
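For orientation, a configuration of this kind might look roughly as follows. This is a hypothetical sketch, not the contents of `examples/autotune_config.yaml`; every key name here is an assumption.

```yaml
# Hypothetical sketch of an autotune config; actual keys may differ.
study_name: momentum-autotune
n_trials: 50
metric: sharpe
direction: maximize
pruner:
  type: median        # or: percentile
  n_warmup_steps: 5
storage: sqlite:///autotune.db   # optional, enables distributed runs
search_space:
  lookback: {type: int, low: 5, high: 60}
  threshold: {type: float, low: 0.0, high: 1.0}
```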


@ayushkrtiwari added the Semver:minor, PR:Accept, hacktoberfest-accepted, Type:Medium, and Type:Hard labels, and removed the hacktoberfest-accepted and Type:Medium labels, on Nov 13, 2025

Labels

PR:Accept · Semver:minor · Type:Hard


