Releases · Nixtla/neuralforecast
v3.1.5
New features
- [FEAT] Explainability for multivariate models @marcopeix (#1464)
- [FEAT] Add XLinear @marcopeix (#1445)
Fixes
- [FIX] Add protection against duplicated level and quantiles in distribution losses @marcopeix (#1448)
- [FIX] Inverse scaling with MASE loss @marcopeix (#1447)
- [FIX] Exogenous support in TimeXer @marcopeix (#1444)
- [FIX] Correct min_samples formula in conformal prediction for step_size @W057 (#1461)
- [FIX] Fix save() to prevent FileExistsError with DDP @W057 (#1460)
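The `min_samples` fix above concerns how many calibration windows fit in a series when `step_size` is greater than one. As a rough illustration of the window-counting logic involved (a minimal sketch, not neuralforecast's actual implementation), the number of complete windows of length `h` obtained by sliding over `n_samples` points with a stride of `step_size` is:

```python
def n_conformal_windows(n_samples: int, h: int, step_size: int) -> int:
    """Count complete windows of length h that fit in a series of
    n_samples points when sliding by step_size.
    Illustrative only -- not neuralforecast's implementation."""
    if n_samples < h:
        return 0
    return (n_samples - h) // step_size + 1

print(n_conformal_windows(100, 12, 1))   # 89 windows with stride 1
print(n_conformal_windows(100, 12, 12))  # 8 non-overlapping windows
```

With `step_size > 1` the count shrinks, which is why the minimum-samples requirement must account for the stride rather than assume one window per time step.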
v3.1.4
Changes
- [CHORE] Version bump @marcopeix (#1437)
- [CHORE]: fix requirements to build whl @deven367 (#1436)
v3.1.3
Features
- [FEAT] Support normalization for static exogenous features @JuditHalperin (#1406)
- [FEAT] Add support for historical and static exogenous in TimeXer @marcopeix (#1404)
Bug fixes
- [FIX] Prevent inverse normalization of quantile validation loss @marcopeix (#1432)
- [FIX] Fix broken links @deven367 (#1433)
- [FIX] Fix type of n_windows in cross_validation @Riklia (#1378)
- [FIX] Apply windows batch size @marcopeix @elephaint (#1429)
Documentation
- Updates contributing for building documentation @nasaul (#1430)
- Improve grammar and readability in documentation @rogeliomj (#1423)
- Add explainability to the sidebar @deven367 (#1428)
- Fixed missing import @ngupta23 (#1425)
- Migrate to `docs.json` @deven367 (#1422)
- Lazydocs to mkdocstrings @deven367 (#1399)
- Fix table render @deven367 (#1409)
Changes
- [CHORE] Drop Python 3.9, add Python 3.13 @marcopeix (#1434)
Many thanks to @nasaul for his work on the CI! We usually don't list these PRs in the release notes, but thanks to his work, tests run better for all contributors.
v3.1.2
Features
- [FEAT] Any horizon explanation @marcopeix (#1393)
Bug fixes
- [FIX] Remove torch fix @elephaint (#1392)
v3.1.1
Hotfix
- [FIX] Backwards compatibility with saved models breaks if "explain" attribute isn't present @elephaint (#1389)
v3.1.0
New features
- [FEAT] (Shapley) explanations for univariate forecast models @elephaint @marcopeix (#1377)
- [FEAT] Any horizon prediction @elephaint @JQGoh (#1368)
- [FEAT]: Training window filtering @marcopeix (#1344)
- [FEAT] xLSTM @elephaint (#1363)
Bug fixes
- [FIX]: torch version @marcopeix (#1380)
- [FIX] Fixes for using uv @elephaint (#1364)
- [FIX] Bugs ISQF @elephaint (#1358)
- Update BiTCN doc dropout to match code @LemuelKL (#1376)
- Adds support for static variables in predict_insample method @nasaul (#1349)
- Remove unused `y_insample` arg from pytorch losses @deven367 (#1360)
General
- Dev environment installation using uv @JQGoh (#1362)
- Enhancements @deven367 (#1371)
- Codespace setup for Neuralforecast @JQGoh (#1382)
- migrate tests from nbs to pytest @deven367 (#1353)
v3.0.2
Enhancements
- Distributional predictions in `predict_insample()` @janovergoor (#1309)
- Optimizations in tsdataset: reduce allocations for large datasets @tylernisonoff (#1335)
Fixes
- [FIX]: Add logic to load custom models when using ReduceLROnPlateau @marcopeix (#1340)
- [FIX]: Fixes incorrect cuts in conformal prediction with conformal_error @elephaint (#1331)
v3.0.1
Features
- FEAT: Select basis functions in NBEATS @tblume1992 @marcopeix (#1191)
- FEAT: Add flash-attention @LeonEthan (#1295)
- FEAT: HuberIQLoss @elephaint (#1307)
Bug Fixes
- FIX: Fix iPython version @elephaint (#1282)
- FIX: Recurrent predictions @elephaint (#1285)
- FIX: Fix poor performance with the NegativeBinomial DistributionLoss @JQGoh (#1289)
- FIX: Add exclude_insample_y param to TimeXer for model loading @marcopeix (#1306)
- FIX: Set 2.0.0<=pytorch<=2.6.0 to avoid conflicts with networkx with Python 3.9 @marcopeix (#1318)
- FIX: Create windows once @elephaint (#1325)
- FIX: Add h_train to RNNs & fix issue with input_size @elephaint (#1326)
- FIX: Allow static vars only with NBEATSx and exogenous block @marcopeix (#1319)
v3.0.0
New features
- FEAT: TimeXer @marcopeix (#1267)
- All losses compatible with all types of models (e.g. univariate/multivariate, direct/recurrent) OR appropriate protection added.
- DistributionLoss now supports the use of `quantiles` in `predict`, allowing for easy quantile retrieval for all DistributionLosses.
- Mixture losses (GMM, PMM and NBMM) now support learned weights for weighted mixture distribution outputs.
- Mixture losses now support the use of `quantiles` in `predict`, allowing for easy quantile retrieval.
- Improved stability of `ISQF` by adding softplus protection around some parameters instead of using `.abs`.
- Unified API for any quantile or any confidence level during predict for both point and distribution losses.
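As an illustration of the quantile-retrieval idea behind these changes (a minimal standard-library sketch, not neuralforecast's own code), arbitrary quantiles can be read off a fitted forecast distribution through its inverse CDF:

```python
from statistics import NormalDist

def quantiles_from_normal(mu: float, sigma: float, quantiles: list) -> list:
    """Recover arbitrary quantiles from a Normal forecast distribution
    via the inverse CDF. Illustrates the mechanism only; neuralforecast
    handles this internally when `quantiles` is passed to `predict`."""
    dist = NormalDist(mu, sigma)
    return [dist.inv_cdf(q) for q in quantiles]

print(quantiles_from_normal(0.0, 1.0, [0.1, 0.5, 0.9]))
```

Because any quantile can be derived from the distribution's parameters after training, the same fitted model can serve any confidence level at predict time without refitting.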
Enhancements
- [DOCS] Docstrings @elephaint (#1279)
- FIX: Minor bug fix in TFT and a nicer error message for fitting with the wrong val_size @marcopeix (#1275)
- [FIX] Adds bfloat16 support @elephaint (#1265)
- Recurrent models can now produce forecasts recursively or directly.
- IQLoss now gives monotonic quantiles
- MASE loss now works
Breaking Changes
- [FIX] Unify API @elephaint (#1023)
- RMoK uses the `revin_affine` parameter instead of `revine_affine`. This was a typo in the previous version.
- All models now inherit the `BaseModel` class. This changes how we implement new models in neuralforecast.
- Recurrent models now require an `input_size` parameter.
- `TCN` and `DRNN` are now window models, not recurrent models.
- Recurrent models saved with a previous version cannot be loaded in v3.0.0.
Bug Fixes
- [FIX] Multivariate models give error when predicting when n_series > batch_size @elephaint (#1276)
- [FIX]: Insample predictions with series of varying lengths @marcopeix (#1246)
Documentation
- [DOCS] Update documentation @elephaint (#1274)
- [DOCS] Add example of modifying the default configure_optimizers() behavior (use of ReduceLROnPlateau scheduler) @JQGoh (#1015)
v2.0.1
Enhancements
- FEAT: Custom RNN layers for TFT @Yanam24 (#1230)
- FEAT: Add the horizon weighing to the distribution losses @mwamsojo (#1233)
Documentation
- DOCS: Add citation note @elephaint (#1244)
- fix: azul @AzulGarza (#1245)