[ENH] Add LightTS model implementation (v2)#2135

Open
Sylver-Icy wants to merge 14 commits into sktime:main from Sylver-Icy:add-lightts

Conversation

@Sylver-Icy

Reference Issues/PRs

#2126

What does this implement?

This PR adds an implementation of the LightTS model for the PyTorch Forecasting v2 model framework.
Implemented the LightTS architecture based on the original paper.
The implementation follows the same design and integration patterns used by existing models such as DLinear.

Did you add any tests for the change?

Yes.

Added tests/test_models/test_lightts_v2.py which includes tests for:

  • Model initialization with different hyperparameters
  • Forward pass prediction shape
  • Quantile loss output shape
  • Univariate forecasting behavior

Any other comments?

The implementation aims to stay consistent with existing v2 model implementations (e.g., DLinear) in structure and integration patterns.

Feedback on architectural alignment with the original LightTS design is welcome.

@codecov
Copy link

codecov bot commented Mar 4, 2026

Codecov Report

❌ Patch coverage is 89.54248% with 16 lines in your changes missing coverage. Please review.
⚠️ Please upload report for BASE (main@1952984).

Files with missing lines                             Patch %   Lines missing
pytorch_forecasting/models/lightts/_lightts_v2.py    88.23%    12 ⚠️
pytorch_forecasting/layers/_blocks/_ie_block.py      84.00%    4 ⚠️
Additional details and impacted files
@@           Coverage Diff           @@
##             main    #2135   +/-   ##
=======================================
  Coverage        ?   86.63%           
=======================================
  Files           ?      169           
  Lines           ?     9884           
  Branches        ?        0           
=======================================
  Hits            ?     8563           
  Misses          ?     1321           
  Partials        ?        0           
Flag     Coverage Δ
cpu      86.63% <89.54%> (?)
pytest   86.63% <89.54%> (?)


phoeenniixx previously approved these changes Mar 4, 2026
Member

@phoeenniixx phoeenniixx left a comment


Thanks!
Please add docstrings to each class and at least to all the public methods.

@phoeenniixx phoeenniixx added enhancement New feature or request module:models labels Mar 4, 2026
@Sylver-Icy
Author

Sylver-Icy commented Mar 4, 2026

Thanks for the feedback! I'll add docstrings.
For the _get_test_datamodule_from part, I'll check the implementation in timexer and adjust or remove it if needed.

And good point about _IEBlock: moving it to the layers module makes sense. I'll update the structure accordingly.

PranavBhatP previously approved these changes Mar 4, 2026
Contributor

@PranavBhatP PranavBhatP left a comment


LGTM!

@Sylver-Icy Sylver-Icy dismissed stale reviews from PranavBhatP and phoeenniixx via a942cab March 5, 2026 05:40
@Sylver-Icy
Author

@phoeenniixx I kept _IEBlock inside the lightts module for now, since it's currently a private helper used only by lightts. In this repo, layers seems to be used mainly for reusable components shared across multiple models.

Since no other model is using it yet, I thought keeping it local would keep things a bit simpler for now.
If you'd still prefer it moved to layers (or another shared location), just let me know and I'm happy to update the structure.

@PranavBhatP
Contributor

PranavBhatP commented Mar 5, 2026

> since no other model is using it yet, I thought keeping it local would keep things a bit simpler for now
> If you'd still prefer it moved to layers (or another shared location) just lmk I'm happy to update the structure

Please move it to the layers folder; all layer modules are meant to be part of it, regardless of whether they are shared or not.

Member

@phoeenniixx phoeenniixx left a comment


Added some doubts and suggestions

        Reshape model output to match v2 prediction format.
        """
        if self.n_quantiles is None:
            return output
Member


I have a doubt: if n_quantiles is 0, what will the shape be here? I think for point prediction losses the shape should be (batch, time, 1), as far as I remember. @PranavBhatP please correct me if I am wrong here.

Author

@Sylver-Icy Sylver-Icy Mar 5, 2026


n_quantiles cannot actually be 0 in this implementation. It is either None, or len(self.loss.quantiles) when using QuantileLoss.

Contributor


> I think for point prediction losses the shape should be (batch,time,1) - as much as I remember.

No, point prediction losses simply expect (batch_size, timesteps), a 2D tensor.

Member

@phoeenniixx phoeenniixx Mar 6, 2026


I meant the output format of the forward; it was 3D, no? When using a point prediction loss.

