Traffic forecasting using tensor decomposition and machine learning on the METR-LA dataset.

Requires Python 3.9+.
```
pip install -r requirements.txt
```

Place the dataset at:

```
data/processed/METR-LA.h5
```
Run as a module from the repo root.

Baselines only:

```
python -m src.main --models ha,naive --quick
```

Quick LSTM (1 epoch):

```
python -m src.main --models lstm --quick
```

Quick XGBoost:

```
python -m src.main --models xgb --quick
```

Full run (all models, default settings):

```
python -m src.main --models ha,naive,lstm,xgb
```

With CP decomposition (optional, slower):

```
python -m src.main --models ha,naive,lstm --use-decomposition --rank 50
```

Command-line options:

- `--data-path` (str): Path to the H5 file. Default: `data/processed/METR-LA.h5`
- `--window-size` (int): History window length. Default: 12
- `--horizon` (int): Number of forecast steps. Default: 3
- `--models` (str): Comma-separated subset of `ha,naive,lstm,xgb`
- `--quick` (flag): Fast run (limits samples, trains the LSTM for 1 epoch, uses fewer XGBoost trees)
- `--limit-samples` (int): Manually cap the number of samples used before the split
- `--use-decomposition` (flag): Apply CP decomposition to the normalized data
- `--rank` (int): CP rank when `--use-decomposition` is set
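The `--window-size` and `--horizon` options define how samples are cut from the sensor time series. A minimal sliding-window sketch with the default values (a hypothetical `make_windows` helper for illustration, not the repo's actual `data_utils` code):

```python
import numpy as np

def make_windows(series, window_size=12, horizon=3):
    """Build (history, future) pairs from a (timesteps, sensors) array."""
    X, y = [], []
    # Each sample uses `window_size` past steps to predict the next `horizon` steps.
    for t in range(len(series) - window_size - horizon + 1):
        X.append(series[t : t + window_size])
        y.append(series[t + window_size : t + window_size + horizon])
    return np.array(X), np.array(y)

data = np.arange(40, dtype=float).reshape(20, 2)  # 20 timesteps, 2 sensors
X, y = make_windows(data)
print(X.shape, y.shape)  # (6, 12, 2) (6, 3, 2)
```

With 20 timesteps, a 12-step window, and a 3-step horizon, 20 − 12 − 3 + 1 = 6 samples are produced.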
Outputs:

- `results/metrics.csv`: Summary metrics per model (MAE, RMSE, MAPE, sMAPE)
- `results/figures/metrics_comparison.png`: Bar charts comparing models
- `results/figures/lstm_predictions.png`: Example prediction plot (if the LSTM is selected)
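The four reported metrics can be sketched in a few lines of NumPy (these are standard textbook definitions, not necessarily the exact code in `src/metrics.py`; the `eps` guard is an assumption to avoid division by zero):

```python
import numpy as np

def mae(y_true, y_pred):
    return np.mean(np.abs(y_true - y_pred))

def rmse(y_true, y_pred):
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def mape(y_true, y_pred, eps=1e-8):
    # Mean absolute percentage error; eps guards against zero readings.
    return 100.0 * np.mean(np.abs((y_true - y_pred) / (np.abs(y_true) + eps)))

def smape(y_true, y_pred, eps=1e-8):
    # Symmetric variant, bounded in [0, 200].
    return 100.0 * np.mean(
        2.0 * np.abs(y_pred - y_true) / (np.abs(y_true) + np.abs(y_pred) + eps)
    )

y_true = np.array([60.0, 50.0, 40.0])  # e.g. speeds in mph
y_pred = np.array([58.0, 55.0, 40.0])
print(mae(y_true, y_pred))  # ≈ 2.333
```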
Project structure:

- `src/main.py`: Orchestrates the pipeline and CLI
- `src/data_utils.py`: Loads the H5 file, normalizes, builds sliding windows
- `src/decomposition.py`: CP/Tucker helpers (optional)
- `src/baselines.py`: Historical Average and Naive persistence baselines
- `src/models/lstm_model.py`: PyTorch LSTM forecaster
- `src/models/xgb_model.py`: XGBoost forecaster (per-horizon, multi-output)
- `src/metrics.py`: MAE, RMSE, MAPE, sMAPE
- `src/visualization.py`: Plots and results table
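The two baselines are simple enough to sketch directly. A minimal version over batches shaped `(samples, window, sensors)` — function names and shapes are illustrative assumptions, not the repo's actual `baselines.py` API:

```python
import numpy as np

def naive_forecast(X, horizon=3):
    """Persistence baseline: repeat the last observed step for each future step."""
    last = X[:, -1, :]                        # (samples, sensors)
    return np.repeat(last[:, None, :], horizon, axis=1)

def historical_average(X, horizon=3):
    """HA baseline: predict the mean of the history window for each future step."""
    avg = X.mean(axis=1)                      # (samples, sensors)
    return np.repeat(avg[:, None, :], horizon, axis=1)

# Toy batch: 4 samples, window of 12 steps, 2 sensors.
X = np.linspace(0.0, 1.0, 4 * 12 * 2).reshape(4, 12, 2)
print(naive_forecast(X).shape)  # (4, 3, 2)
```

Both baselines are constant across the horizon, which is why they serve as a sanity floor for the LSTM and XGBoost models.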
- Run from the repo root as a module: `python -m src.main`
- Plots are saved to `results/figures/`