Merged
10 changes: 6 additions & 4 deletions pyproject.toml
@@ -7,7 +7,7 @@ name = "growthcurves"
# See the section below: [tools.setuptools.dynamic]
dynamic = [
"version", # version is loaded from the package
"dependencies", # add if using requirements.txt
# "dependencies", # add if using requirements.txt
]
readme = "README.md"
requires-python = ">=3.9" # test all higher Python versions
@@ -19,12 +19,14 @@ classifiers = [
# Also update LICENSE file if you pick another one
license = "GPL-3.0-or-later" # https://choosealicense.com/licenses/gpl-3.0/
# # add dependencies here: (use one of the two)
# dependencies = ["numpy", "pandas", "scipy", "matplotlib", "seaborn"]
dependencies = [
"numpy", "scipy", "plotly", "scikit-learn",
]
# use requirements.txt instead of pyproject.toml for dependencies
# https://stackoverflow.com/a/73600610/9684872
# ! uncomment also dependencies in the dynamic section above
[tool.setuptools.dynamic]
dependencies = {file = ["requirements.txt"]}
# [tool.setuptools.dynamic]
# dependencies = {file = ["requirements.txt"]}
Comment on lines 25 to +29
Copilot AI Feb 27, 2026
The outdated comments from the previous requirements.txt approach should be removed. These comments (lines 25-27, 28-29) reference the old approach and are misleading now that dependencies are defined directly in the dependencies list.



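For context, setuptools supports either a static dependencies list or a dynamic one loaded from a file, but not both for the same field; a minimal sketch of the two alternatives (the PR switches to the first):

```toml
[project]
# Option 1 (what this PR adopts): declare dependencies statically
dependencies = ["numpy", "scipy", "plotly", "scikit-learn"]

# Option 2 (the old approach): mark the field dynamic and load it from a file
# dynamic = ["dependencies"]
# [tool.setuptools.dynamic]
# dependencies = { file = ["requirements.txt"] }
```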
[project.urls]
3 changes: 0 additions & 3 deletions requirements.txt

This file was deleted.

17 changes: 12 additions & 5 deletions src/growthcurves/non_parametric.py
@@ -9,11 +9,14 @@
from logging import getLogger

import numpy as np
import sklearn.linear_model
Copilot AI Feb 27, 2026

The import style import sklearn.linear_model is inconsistent with the rest of the codebase which uses from <package>.<module> import <class> pattern (e.g., from scipy.interpolate import make_smoothing_spline). Consider using from sklearn.linear_model import HuberRegressor for consistency.

from scipy.interpolate import make_smoothing_spline
from scipy.stats import theilslopes

from .inference import bad_fit_stats

# from scipy.stats import theilslopes


Comment on lines +17 to +19
Copilot AI Feb 27, 2026

The commented-out import from scipy.stats import theilslopes should be removed entirely as it's no longer needed after switching to HuberRegressor.

Suggested change
# from scipy.stats import theilslopes

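To illustrate the trade-off the PR makes, here is a hedged side-by-side sketch of the two robust line fitters on the same synthetic data (the data and tolerances are illustrative, not from the package):

```python
import numpy as np
from scipy.stats import theilslopes
from sklearn.linear_model import HuberRegressor

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 50)
y = 0.5 * t + 1.0 + rng.normal(0, 0.05, t.size)  # true slope 0.5
y[::10] += 3.0  # contaminate 10% of the points with large positive outliers

# Theil-Sen: median of pairwise slopes, very robust but O(n^2) in pairs
slope_ts, intercept_ts, _, _ = theilslopes(y, t)

# Huber: robust loss on residuals plus a small L2 penalty on the coefficients
hub = HuberRegressor().fit(t.reshape(-1, 1), y)
slope_huber = hub.coef_[0]
```

Both estimators recover a slope near 0.5 despite the outliers; the difference is in cost and in how aggressively each downweights bad points.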
logger = getLogger(__name__)

# Default settings for the auto-spline (sigma-based smoothing + OD weights).
@@ -79,7 +82,7 @@ def fit_sliding_window(t, N, window_points=15, step=None, n_fits=None, **kwargs)
step = 1
else:
step = max(1, int(len(t) / n_fits))

huber_regressor = sklearn.linear_model.HuberRegressor()
# limit number of fits to avoid excessive computation using step parameter
for i in range(0, len(t) - w + 1, step):
t_win = t[i : i + w]
@@ -88,9 +91,13 @@ def fit_sliding_window(t, N, window_points=15, step=None, n_fits=None, **kwargs)
if np.ptp(t_win) <= 0:
continue

# Use Theil-Sen estimator for robust line fitting
result = theilslopes(y_log_win, t_win)
slope, intercept = result.slope, result.intercept
# # Use Theil-Sen estimator for robust line fitting
# result = theilslopes(y_log_win, t_win)
# slope, intercept = result.slope, result.intercept
Comment on lines +94 to +96
Copilot AI Feb 27, 2026

The commented-out code for Theil-Sen should be removed rather than left in place. Since this is a deliberate switch to HuberRegressor and the PR description indicates it's roughly twice as fast with similar results, the old code should be deleted to keep the codebase clean.

# # Use HuberRegressor which uses L2 regularization and is twice as fast as
# # Theil-Sen.
Comment on lines +94 to +98
Copilot AI Feb 27, 2026

The comment has a double hash '# #' which appears to be a formatting issue. This should be cleaned up to use a single '#' for consistency with standard Python comment formatting.

Comment on lines +97 to +98
Copilot AI Feb 27, 2026

The comment states "uses L2 regularization" but this is inaccurate. HuberRegressor uses L2 regularization as a penalty on the coefficients (controlled by the alpha parameter), but the main feature is that it uses the Huber loss function for robustness to outliers, not L2 regularization. The comment should clarify that HuberRegressor is robust to outliers through the Huber loss function.

Suggested change
# # Use HuberRegressor which uses L2 regularization and is twice as fast as
# # Theil-Sen.
# # Use HuberRegressor, which is robust to outliers via the Huber loss
# # and is typically faster than Theil-Sen.

result = huber_regressor.fit(t_win.reshape(-1, 1), y_log_win)
slope, intercept = result.coef_[0], result.intercept_

if slope > best_slope:
best_slope = slope
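The sliding-window loop above can be sketched as a standalone function; this is an illustrative re-implementation of the pattern in the diff, not the package's exact API (the helper name max_growth_rate and its defaults are hypothetical):

```python
import numpy as np
from sklearn.linear_model import HuberRegressor

def max_growth_rate(t, N, window_points=15, step=1):
    """Estimate the maximum specific growth rate: fit a robust line to
    log(N) over each sliding window and keep the steepest slope."""
    y_log = np.log(N)
    huber = HuberRegressor()
    best_slope, best_intercept = -np.inf, np.nan
    for i in range(0, len(t) - window_points + 1, step):
        t_win = t[i : i + window_points]
        if np.ptp(t_win) <= 0:  # skip windows with no time spread
            continue
        fit = huber.fit(t_win.reshape(-1, 1), y_log[i : i + window_points])
        slope, intercept = fit.coef_[0], fit.intercept_
        if slope > best_slope:
            best_slope, best_intercept = slope, intercept
    return best_slope, best_intercept

# Exponential growth N(t) = 0.05 * exp(0.8 * t): log(N) is linear in t,
# so the recovered maximum slope should be close to 0.8.
t = np.linspace(0, 5, 60)
N = 0.05 * np.exp(0.8 * t)
slope, _ = max_growth_rate(t, N)
```

Because the per-window fit is a plain regression call, swapping the estimator (Theil-Sen, Huber, OLS) only changes two lines, which is what makes the change in this PR so small.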