# Lvar: Learn Visual Autoregressive

## Baseline Models
- VAR (NeurIPS'2024 Best Paper Award)
- Infinity (CVPR'2025 Oral)
## Acceleration Methods
- FastVAR (ICCV'2025)
- SparseVAR (ICCV'2025)
- ScaleKV (NeurIPS'2025)
- SkipVAR
## Installation

- Some customized kernels are written for Hopper GPUs and depend on optimizations specific to CUDA Toolkit ≥ 12.8 (12.8.1 recommended).
- For PyTorch, the recommended version is 2.7.1 or later.
```shell
conda create -n torch271 python=3.12
conda activate torch271

# for CUDA 12.8
pip install torch==2.7.1 torchvision==0.22.1 torchaudio==2.7.1 --index-url https://download.pytorch.org/whl/cu128

# flash-attention
MAX_JOBS=16 pip install flash-attn --no-build-isolation
```
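After installing, it can help to sanity-check that the environment matches the requirements above. The helper below is a small hypothetical sketch (not part of Lvar) for comparing dotted version strings; the torch-specific checks are shown as comments since they require a GPU machine.

```python
def meets_min_version(version: str, minimum: str) -> bool:
    """Numerically compare dotted version strings, e.g. '2.7.1' >= '2.7.0'.

    Local-build suffixes such as '+cu128' are ignored.
    """
    def parse(v: str) -> tuple:
        return tuple(int(p) for p in v.split("+")[0].split(".")[:3])
    return parse(version) >= parse(minimum)


# In the torch271 environment you could then check:
# import torch
# assert meets_min_version(torch.__version__, "2.7.1")
# assert meets_min_version(torch.version.cuda or "0", "12.8")
# # Hopper GPUs report compute capability 9.x:
# assert torch.cuda.get_device_capability()[0] >= 9
```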
```shell
cd Lvar
pip install -r requirements.txt
```

Since the repo is under development, the Python path should be set manually:

```shell
vim ~/.bashrc
# add the following line, replacing {your-path} with the directory containing Lvar:
export PYTHONPATH=$PYTHONPATH:{your-path}/Lvar
```

## Our SparseVAR
### HART

```shell
cd models/hart/kernels
bash install.sh
```

### Download flan-t5-xl
```python
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-xl")
model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-xl")
```

These three lines will download flan-t5-xl to your ~/.cache/huggingface directory.
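If you prefer to keep the weights somewhere other than `~/.cache/huggingface`, the Hugging Face libraries honor the `HF_HOME` environment variable. The snippet below is only an example; `/data/hf_cache` is a placeholder path, not a path used by this repo.

```python
import os

# Example only: point the Hugging Face cache at a custom directory.
# This must be set before transformers / huggingface_hub are imported.
os.environ["HF_HOME"] = "/data/hf_cache"
```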
Alternatively:

```shell
cd pretrained_models/infinity
bash hf_down.sh
```

If you want to download all the weights at once, run:
```shell
mkdir pretrained_models/infinity/Infinity
cd pretrained_models/infinity/Infinity
huggingface-cli download FoundationVision/Infinity --local-dir ./
```

To download only the commonly used weights, run:
```shell
mkdir pretrained_models/infinity/Infinity
cd pretrained_models/infinity/Infinity
huggingface-cli download FoundationVision/Infinity --include="infinity_vae_d32reg.pth" --local-dir ./
huggingface-cli download FoundationVision/Infinity --include="infinity_2b_reg.pth" --local-dir ./
```

For more models, please refer to the README of each model in the pretrained_models/ directory.
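The CLI calls above can also be done from Python via `huggingface_hub.snapshot_download`. This is a sketch, assuming `huggingface_hub` is installed; the wrapper function name is ours, not part of the repo.

```python
def download_infinity_weights(local_dir: str = "pretrained_models/infinity/Infinity") -> str:
    """Fetch only the commonly used Infinity weights (same files as the CLI above)."""
    from huggingface_hub import snapshot_download  # pip install huggingface_hub

    # allow_patterns restricts the download to matching files in the repo.
    return snapshot_download(
        repo_id="FoundationVision/Infinity",
        allow_patterns=["infinity_vae_d32reg.pth", "infinity_2b_reg.pth"],
        local_dir=local_dir,
    )
```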
## Benchmarks

We provide code and corresponding scripts for various benchmarks; please refer to the README of each benchmark.
## Acknowledgements

The Lvar codebase is adapted from VAR and Infinity. Special thanks to their excellent work!