
HtulTS - Lightweight Time Series Model with Forgetting Mechanisms

Overview

HtulTS is a lightweight time series model designed for both pretraining and downstream tasks (forecasting and classification). It features a shared linear backbone architecture with task-specific heads and optional forgetting mechanisms for improved transfer learning capabilities.
⚠️ This is an experimental, work-in-progress repository aiming to build a strong time series forecaster.

Key Features

  • Shared Linear Backbone: Efficient MLP-based backbone for feature extraction across all tasks
  • Task-Specific Heads: Separate heads optimized for:
    • Pretraining: Denoising autoencoder for representation learning
    • Forecasting: Regression head for time series prediction
    • Classification: Convolutional layers for time series classification
  • Forgetting Mechanisms: Multiple strategies to reduce catastrophic forgetting during fine-tuning
  • Noise Injection: Optional Gaussian noise during pretraining for robust feature learning

Architecture Components

1. ForgettingMechanisms Module

Implements three types of forgetting strategies to prevent catastrophic forgetting:

class ForgettingMechanisms(nn.Module):
    """Learnable forgetting gates and adaptive mechanisms"""

Forgetting Types:

  • activation: Learnable forgetting gates applied to activation patterns
  • weight: Weight-level importance scoring with decay
  • adaptive: Adaptive forgetting based on gradient magnitude
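To make the "activation" strategy concrete, here is a minimal sketch of a learnable forgetting gate. The class name, initialization, and the blend with `forgetting_rate` are assumptions for illustration, not the repository's exact implementation:

```python
import torch
import torch.nn as nn

class ActivationForgettingGate(nn.Module):
    """Illustrative 'activation'-type forgetting gate: a learnable
    per-feature gate in (0, 1) that softly suppresses (forgets)
    features during fine-tuning."""

    def __init__(self, dim: int, forgetting_rate: float = 0.1):
        super().__init__()
        # Positive initial logits so the gate starts near 1 (keep everything).
        self.gate_logits = nn.Parameter(torch.full((dim,), 4.0))
        self.forgetting_rate = forgetting_rate

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        gate = torch.sigmoid(self.gate_logits)  # per-feature gate in (0, 1)
        # Blend gated and original activations; forgetting_rate controls
        # how strongly the gate can suppress features.
        return (1 - self.forgetting_rate) * x + self.forgetting_rate * gate * x
```

With `forgetting_rate=0`, the module reduces to the identity, so it can be enabled without disturbing a pretrained backbone at initialization.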

2. LightweightModel

Core model architecture with shared backbone and task-specific heads:

class LightweightModel(nn.Module):
    """
    Shared linear backbone + task-specific heads
    """

Shared Backbone:

  • 2-layer MLP with ReLU activations
  • Hidden dimension: 512
  • Dropout: 0.2 and 0.1
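The backbone described above can be sketched as follows; the layer ordering (dropout after each ReLU) is an assumption based on the README:

```python
import torch
import torch.nn as nn

def build_backbone(input_len: int, hidden: int = 512) -> nn.Sequential:
    """Minimal sketch of the shared 2-layer MLP backbone:
    hidden dim 512, ReLU activations, dropout 0.2 then 0.1."""
    return nn.Sequential(
        nn.Linear(input_len, hidden),
        nn.ReLU(),
        nn.Dropout(0.2),
        nn.Linear(hidden, hidden),
        nn.ReLU(),
        nn.Dropout(0.1),
    )
```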

Task-Specific Heads:

| Task | Head Type | Components |
|------|-----------|------------|
| Pretraining | Linear Decoder | Single linear layer mapping to input length |
| Forecasting | Regression Head | 2-layer MLP with dropout for prediction length |
| Classification | CNN Classifier | 3-layer Conv1d + adaptive pooling + classifier |
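As one example, the CNN classification head from the table (3-layer Conv1d + adaptive pooling + classifier) could look like the sketch below; channel widths and kernel sizes are illustrative assumptions:

```python
import torch
import torch.nn as nn

class ClassificationHead(nn.Module):
    """Sketch of the CNN classifier head: three Conv1d layers,
    adaptive pooling over time, then a linear classifier."""

    def __init__(self, in_channels: int, num_classes: int):
        super().__init__()
        self.convs = nn.Sequential(
            nn.Conv1d(in_channels, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(64, 128, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(128, 128, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        self.pool = nn.AdaptiveAvgPool1d(1)  # collapse the time axis
        self.classifier = nn.Linear(128, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, seq_len)
        h = self.pool(self.convs(x)).squeeze(-1)  # (batch, 128)
        return self.classifier(h)
```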

3. Model (Forecasting)

Wrapper for pretraining and forecasting tasks:

class Model(nn.Module):
    """TimeDART forecasting model"""

Methods:

  • pretrain(): Denoising autoencoder training
  • forecast(): Time series forecasting
  • Automatic normalization/denormalization
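The automatic normalization/denormalization around forecasting can be sketched as instance normalization over the time axis (a RevIN-style scheme; the exact method used by the repository is an assumption here, and `model` is any callable mapping normalized input to a normalized forecast):

```python
import torch

def forecast_with_norm(model, x: torch.Tensor) -> torch.Tensor:
    """Normalize per instance, forecast, then denormalize.
    x: (batch, seq_len, channels)."""
    mean = x.mean(dim=1, keepdim=True)
    std = x.std(dim=1, keepdim=True) + 1e-5  # avoid division by zero
    y = model((x - mean) / std)              # forecast in normalized space
    return y * std + mean                    # restore the input scale
```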

4. ClsModel (Classification)

Specialized model for classification tasks:

class ClsModel(nn.Module):
    """TimeDART classification model"""

Configuration Parameters

Model Parameters

args.input_len          # Input sequence length (default: 336)
args.pred_len           # Prediction length (for forecasting)
args.enc_in             # Input channels/features
args.task_name          # "pretrain" or "finetune"
args.downstream_task    # "forecast" or "classification"
args.use_norm           # Apply normalization (default: True)

Noise Parameters

args.use_noise          # Enable noise injection during pretraining
args.noise_level        # Noise scaling factor (default: 0.1)
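Gaussian noise injection scaled by `noise_level` can be sketched as below (the exact scaling scheme is an assumption):

```python
import torch

def inject_noise(x: torch.Tensor, noise_level: float = 0.1) -> torch.Tensor:
    """Add zero-mean Gaussian noise scaled by noise_level, as used in
    denoising-autoencoder pretraining."""
    return x + noise_level * torch.randn_like(x)
```

With `noise_level=0` this is the identity, so it can be toggled via `args.use_noise` without a code path change.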

Forgetting Parameters

args.use_forgetting     # Enable forgetting mechanisms
args.forgetting_type    # "activation", "weight", or "adaptive"
args.forgetting_rate    # Forgetting rate (default: 0.1)
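A hypothetical configuration combining the parameters above; defaults follow the README where stated, while `pred_len` and `enc_in` values are illustrative:

```python
from argparse import Namespace

# Example args namespace mirroring the configuration parameters above.
args = Namespace(
    input_len=336,             # input sequence length (README default)
    pred_len=96,               # illustrative prediction length
    enc_in=7,                  # illustrative channel count
    task_name="pretrain",      # "pretrain" or "finetune"
    downstream_task="forecast",
    use_norm=True,
    use_noise=True,
    noise_level=0.1,
    use_forgetting=True,
    forgetting_type="activation",
    forgetting_rate=0.1,
)
```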

Training and Evaluation

sh scripts/pretrain/ETTh2.sh && sh scripts/finetune/ETTh2.sh

Notes

  • Checkpoints are saved to ./outputs/pretrain_checkpoints/ during pretraining
  • Fine-tuning checkpoints go to ./outputs/checkpoints/
  • Test results and metrics saved to ./outputs/test_results/
  • TensorBoard logs available in ./outputs/logs/

QUICK_REFERENCE.sh - Copy-paste commands for logging

============================================================

QUICK START - Copy and paste these commands

============================================================

RUN EVERYTHING WITH LOGGING (RECOMMENDED):

sh scripts/run_with_logs.sh ETTh1

RUN PRETRAIN ONLY:

sh scripts/pretrain_ETTh1_with_logs.sh

RUN FINETUNE ONLY:

sh scripts/finetune_ETTh1_with_logs.sh

RUN BOTH (CHAINED):

sh scripts/pretrain_ETTh1_with_logs.sh && sh scripts/finetune_ETTh1_with_logs.sh

MANUAL LOG VIEWING:

cat outputs/logs/pretrain_ETTh1_.log | tail -50
cat outputs/logs/finetune_ETTh1_.log | tail -50
ls -lh outputs/logs/

============================================================

KEY POINTS:

============================================================

1. All logs saved to: outputs/logs/

2. Log format: <YYYYMMDD_HHMMSS>.log

3. Output appears on terminal AND saved to file

4. Supports any dataset name automatically

5. Use log_viewer.sh to quickly check latest logs

============================================================
