
TF-CoDiT: Conditional Time Series Synthesis with Diffusion Transformers for Treasury Futures

Anonymous Authors


Contents

- Setup
- Data Preparation
- Training
- Inference
- Citation

Setup

# Create environment (Python 3.10)
conda create -n tf-codit python=3.10
conda activate tf-codit

# Clone repository
git clone https://github.com/username/repo.git  # will be renamed once our repo is public
cd repo

# Install dependencies
pip install -r requirements.txt

Data Preparation

Download the data from Google Drive and extract the files to data/.

python data_process.py \
  --data_dir data \
  --output_dir data/processed
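What data_process.py does internally is not documented here; as a hypothetical illustration of one common preprocessing step for futures price series, a per-contract z-score normalization might look like this (not the actual repository logic):

```python
# Hypothetical sketch of a typical preprocessing pass; the real
# data_process.py may differ.
def zscore(series):
    """Z-score normalize one contract's price series."""
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series) / n
    std = var ** 0.5 or 1.0  # guard against constant series
    return [(x - mean) / std for x in series]

raw = [101.2, 101.5, 101.1, 100.9, 101.3]  # toy close prices
norm = zscore(raw)
```

Normalizing per contract keeps contracts with different price levels on a comparable scale before they are windowed for training.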

Training

Train U-VAE

torchrun \
    --nproc_per_node=4 \
    --nnodes=1 \
    --node_rank=0 \
    --master_addr=localhost \
    --master_port=12355 \
    vae/train_vae.py \
    --config_file configs/vae/ts-vae.yaml
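The torchrun launcher spawns one worker per GPU and passes rank information through environment variables; a minimal sketch of how a training script such as vae/train_vae.py might read them (illustrative, not this repo's actual code):

```python
import os

# torchrun sets these for each worker; the defaults below make the
# snippet runnable outside torchrun as well.
rank = int(os.environ.get("RANK", 0))
world_size = int(os.environ.get("WORLD_SIZE", 1))
local_rank = int(os.environ.get("LOCAL_RANK", 0))

# With --nproc_per_node=4 --nnodes=1, world_size is 4 and each worker
# binds one GPU via local_rank (e.g. torch.cuda.set_device(local_rank)).
print(rank, world_size, local_rank)
```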

Train DiT

deepspeed train.py -c configs/dit/gemma-it.yaml

Configs live in configs/. Adjust batch size, model paths, etc. as needed.
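The README does not spell out the training objective; diffusion models are typically trained to predict the noise injected by a DDPM-style forward process. A generic sketch of that forward step (an assumption about the setup, not code from train.py):

```python
import math
import random

def add_noise(x0, t, T=1000, beta_min=1e-4, beta_max=0.02):
    """Return (x_t, eps): the noised sample and the noise the model must predict."""
    # Linear beta schedule; alpha_bar is the cumulative product of (1 - beta_s).
    alpha_bar = 1.0
    for s in range(t + 1):
        beta = beta_min + (beta_max - beta_min) * s / (T - 1)
        alpha_bar *= 1.0 - beta
    eps = [random.gauss(0.0, 1.0) for _ in x0]
    x_t = [math.sqrt(alpha_bar) * x + math.sqrt(1.0 - alpha_bar) * e
           for x, e in zip(x0, eps)]
    return x_t, eps

random.seed(0)
x_t, eps = add_noise([0.5, -0.2, 1.1], t=500)
```

Training then regresses the model's output against eps with an MSE loss; the batch size and schedule parameters would come from the YAML configs.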

Inference

Convert the trained checkpoints into a diffusers pipeline and generate samples:

python sample.py \
    --fusedit_config configs/fusedit/config.yaml \
    --fusedit_checkpoint outputs/fusedit \
    --vae_checkpoint outputs/vae_for_dwt_d64 \
    --prompt "generate TF contract from 2025-01-01 to 2025-02-01" \
    --num_inference_steps 50 \
    --guidance_scale 7.0 \
    --output_dir ./results
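The --guidance_scale 7.0 flag suggests classifier-free guidance. At each of the --num_inference_steps, the pipeline would combine the unconditional and conditional noise predictions along these lines (a generic sketch, not this repo's implementation):

```python
def cfg_combine(eps_uncond, eps_cond, scale):
    """Classifier-free guidance: push the prediction toward the conditional branch."""
    return [u + scale * (c - u) for u, c in zip(eps_uncond, eps_cond)]

# Toy two-dimensional noise predictions from the two branches.
guided = cfg_combine([0.1, 0.2], [0.3, 0.1], scale=7.0)
# scale = 1.0 recovers the plain conditional prediction.
```

Larger scales follow the prompt condition more aggressively at the cost of sample diversity, which is why 7.0 is exposed as a tunable flag.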

Citation

@article{author2026tf-codit,
  title   = {TF-CoDiT: Conditional Time Series Synthesis with Diffusion Transformers for Treasury Futures},
  author  = {Anonymous Authors},
  year    = {2026},
  journal = {arXiv preprint arXiv:2601.xxxxx}
}

About

The official implementation of TF-CoDiT.
