Commit f8e5dfc (0 parents): first commit

54 files changed, 8748 insertions(+), 0 deletions(-)

.gitignore

Lines changed: 161 additions & 0 deletions
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# Jupyter Notebook
.ipynb_checkpoints

# IPython
profile_default/
ipython_config.py

# pyenv
.python-version

# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock

# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/

# Celery stuff
celerybeat-schedule
celerybeat.pid

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/
/scripts/long_term_forecast/Traffic_script/PatchTST1.sh
/backups/
/result.xlsx
/~$result.xlsx
/Time-Series-Library.zip
/temp.sh

.idea
/tv_result.xlsx
/test.py
/m4_results/
/test_results/
/PatchTST_results.xlsx
/seq_len_long_term_forecast/
/progress.xlsx
/scripts/short_term_forecast/PatchTST_M4.sh
/run_tv.py

/scripts/long_term_forecast/ETT_tv_script/
/dataset/
data_factory_all.py
data_loader_all.py
/scripts/short_term_forecast/tv_script/
/exp/exp_short_term_forecasting_tv.py
/exp/exp_long_term_forecasting_tv.py
/timesnetv2.xlsx
/scripts/anomaly_detection/tmp/
/scripts/imputation/tmp/
/utils/self_tools.py

checkpoints
logs

LICENSE

Lines changed: 21 additions & 0 deletions
MIT License

Copyright (c) 2024

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

README.md

Lines changed: 134 additions & 0 deletions
# Unified Time Series Model

[**Project Page**](https://zitniklab.hms.harvard.edu/projects/UniTS/) | [**Paper link**]()

UniTS is a unified time series model that can handle a variety of tasks across multiple domains with shared parameters and no task-specific modules.

Authors: [Shanghua Gao](https://shgao.site/), [Teddy Koker](https://teddykoker.com), [Owen Queen](https://owencqueen.github.io/), [Thomas Hartvigsen](https://www.tomhartvigsen.com/), [Theodoros Tsiligkaridis](https://sites.google.com/view/theo-t), [Marinka Zitnik](https://zitniklab.hms.harvard.edu/)

## Overview

Foundation models, especially LLMs, are profoundly transforming deep learning. Instead of training many task-specific models, a single pretrained model can be adapted to many tasks via few-shot prompting or fine-tuning. However, current foundation models handle sequence data but not time series, which pose unique challenges: time series datasets are inherently diverse and multi-domain, task specifications diverge across forecasting, classification, and other task types, and models have so far appeared to require task-specific specialization.

We developed UniTS, a unified time series model that supports a universal task specification, accommodating classification, forecasting, imputation, and anomaly detection tasks. This is achieved through a novel unified network backbone, which incorporates sequence and variable attention along with a dynamic linear operator and is trained as a unified model.

Across 38 multi-domain datasets, UniTS demonstrates superior performance compared to task-specific models and repurposed natural-language LLMs. UniTS exhibits remarkable zero-shot, few-shot, and prompt learning capabilities when evaluated on new data domains and tasks.

<p align="center">
<img src="https://zitniklab.hms.harvard.edu/img/UniTS-1.png" alt="UniTS-1" width="500">
</p>
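To give intuition for the backbone's two attention axes, here is a toy, pure-Python sketch (not the UniTS implementation): plain dot-product self-attention applied along the time axis, one variable at a time, versus along the variable axis, one time step at a time.

```python
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(tokens):
    """Plain dot-product self-attention over a list of d-dim token vectors."""
    d = len(tokens[0])
    out = []
    for q in tokens:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in tokens]
        w = softmax(scores)
        out.append([sum(wi * v[j] for wi, v in zip(w, tokens)) for j in range(d)])
    return out

# x[t][v] is a d-dim embedding for time step t of variable v.
def sequence_attention(x):
    """Attend along time, independently for each variable."""
    n_t, n_v = len(x), len(x[0])
    cols = [[x[t][v] for t in range(n_t)] for v in range(n_v)]
    res = [attention(col) for col in cols]
    return [[res[v][t] for v in range(n_v)] for t in range(n_t)]

def variable_attention(x):
    """Attend across variables, independently at each time step."""
    return [attention(row) for row in x]
```

Both operators preserve the (time, variable, dim) shape; they differ only in which axis the tokens are drawn from.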
## Setups

### 1. Requirements
Install PyTorch 2.0+ and the required packages:
```
pip install -r requirements.txt
```

### 2. Prepare data
```
bash download_data_all.sh
```
Dataset configs for the different multi-task settings are provided in the `.yaml` files of the `data_provider` folder.

By default, all experiments follow the multi-task setting, where one UniTS model is jointly trained on multiple datasets.
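For orientation, a multi-task config pairs each dataset with a task type and its sequence lengths. The fragment below is a hypothetical sketch only; the field and dataset names are illustrative, not the actual schema, so consult the real `.yaml` files in `data_provider/`:

```yaml
# Hypothetical sketch; field names are illustrative, not the real schema.
task_dataset:
  ETTh1_forecast:
    dataset: ETTh1
    task_name: long_term_forecast
    seq_len: 96
    pred_len: 96
  Heartbeat_cls:
    dataset: Heartbeat
    task_name: classification
```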
### 3. Train and evaluate model

#### 1. Multi-task learning on forecasting and classification tasks:

- Pretraining + Prompt learning
```
bash ./scripts/pretrain_prompt_learning/UniTS_pretrain_x128.sh
```

- Supervised learning
```
bash ./scripts/supervised_learning/UniTS_supervised.sh
```

#### 2. Few-shot transfer learning on new forecasting and classification tasks:

**Note: Please follow the instructions in the following training scripts to get the pretrained ckpt first.**

- Finetuning
```
# please set the pretrained model path in the script.
bash ./scripts/few_shot_newdata/UniTS_finetune_few_shot_newdata_pct20.sh
```

- Prompt tuning
```
# please set the pretrained model path in the script.
bash ./scripts/few_shot_newdata/UniTS_finetune_few_shot_newdata_pct20.sh
```
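Finetuning updates the whole model, while prompt tuning keeps the shared backbone frozen and learns only a small set of task-specific prompt parameters. The toy model below (plain Python, not the UniTS code) illustrates that difference: the gradient step touches only the prompt token.

```python
class ToyPromptModel:
    """Toy linear model: output = w * x + p, where w is a frozen
    pretrained backbone weight and p is a trainable prompt token."""

    def __init__(self, w=2.0, p=0.0):
        self.w = w  # backbone weight: frozen during prompt tuning
        self.p = p  # prompt token: the only trainable parameter

    def forward(self, x):
        return self.w * x + self.p

    def prompt_tune_step(self, x, target, lr=0.1):
        # squared-error loss; the gradient update touches only the prompt
        err = self.forward(x) - target
        self.p -= lr * 2.0 * err
        # self.w is deliberately left untouched (frozen backbone)
        return err * err

model = ToyPromptModel()
losses = [model.prompt_tune_step(x=1.0, target=5.0) for _ in range(50)]
```

After a few steps the prompt absorbs the task-specific offset while the backbone weight never changes, which is why prompt tuning is cheap enough for few-shot transfer.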
#### 3. Few-shot transfer learning on anomaly detection tasks:

- Finetuning
```
# please set the pretrained model path in the script.
bash ./scripts/few_shot_anomaly_detection/UniTS_finetune_few_shot_anomaly_detection.sh
```

- Prompt tuning
```
# please set the pretrained model path in the script.
bash ./scripts/few_shot_anomaly_detection/UniTS_prompt_tuning_few_shot_anomaly_detection.sh
```
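As background on the usual protocol here (a generic sketch, not the UniTS code): reconstruction-based anomaly detection scores each point by its reconstruction error and flags the highest-error fraction of points.

```python
def anomaly_flags(series, recon, ratio=0.01):
    """Flag the top `ratio` fraction of points by squared reconstruction
    error, mimicking the common threshold-on-reconstruction-error setup."""
    errs = [(a - b) ** 2 for a, b in zip(series, recon)]
    k = max(1, int(len(errs) * ratio))
    thresh = sorted(errs, reverse=True)[k - 1]  # k-th largest error
    return [e >= thresh for e in errs]
```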

#### 4. Few-shot transfer learning on imputation tasks:

- Finetuning
```
# please set the pretrained model path in the script.
bash ./scripts/few_shot_imputation/UniTS_finetune_few_shot_imputation_mask050.sh
```

- Prompt tuning
```
# please set the pretrained model path in the script.
bash ./scripts/few_shot_imputation/UniTS_prompt_tuning_few_shot_imputation_mask050.sh
```
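The `mask050` suffix in these script names refers to the masking ratio: 50% of the points are hidden. As a generic illustration of that evaluation setup (stdlib only, with a mean-imputation baseline rather than the UniTS model), hide a random fraction of values and score the imputer only on the hidden positions:

```python
import random

def mask_series(series, mask_ratio=0.5, seed=0):
    """Hide mask_ratio of the points; returns (masked, mask) where
    masked[i] is None at hidden positions and mask[i] is True there."""
    rng = random.Random(seed)
    hidden = set(rng.sample(range(len(series)), int(len(series) * mask_ratio)))
    mask = [i in hidden for i in range(len(series))]
    masked = [None if m else v for v, m in zip(series, mask)]
    return masked, mask

def impute_mean(masked):
    """Baseline imputer: fill hidden points with the observed mean."""
    observed = [v for v in masked if v is not None]
    mean = sum(observed) / len(observed)
    return [mean if v is None else v for v in masked]

def masked_mse(pred, truth, mask):
    """Score only the positions the imputer never saw."""
    errs = [(p - t) ** 2 for p, t, m in zip(pred, truth, mask) if m]
    return sum(errs) / len(errs)
```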

#### 5. Zero-shot learning on new forecasting length:
```
# please set the pretrained model path in the script.
bash ./scripts/zero_shot/UniTS_forecast_new_length_unify.sh
```

#### 6. Zero-shot learning on new forecasting datasets:
```
# A special version of UniTS with shared prompt/mask tokens needs to be trained for this setting.
bash ./scripts/zero_shot/UniTS_zeroshot_newdata.sh
```

## Use UniTS on your own data

UniTS is a highly flexible unified time series model, supporting tasks such as forecasting, classification, imputation, and anomaly detection with a single shared model and shared weights. We provide a [Tutorial](Tutorial.md) to assist you in using your own data with UniTS.
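The exact input format is covered in the Tutorial; as a generic, hypothetical illustration of how a raw multivariate series is commonly turned into forecasting samples, slice it into sliding windows of `seq_len` inputs and `pred_len` targets:

```python
def make_forecast_windows(series, seq_len=96, pred_len=24, stride=1):
    """Slice a series (a list of per-step observations, each a list of
    variable values) into (input_window, target_window) training pairs."""
    pairs = []
    for start in range(0, len(series) - seq_len - pred_len + 1, stride):
        x = series[start : start + seq_len]              # model input
        y = series[start + seq_len : start + seq_len + pred_len]  # target
        pairs.append((x, y))
    return pairs
```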

## Citation

```
@article{gao2024building,
  title={UniTS: Building a Unified Time Series Model},
  author={Gao, Shanghua and Koker, Teddy and Queen, Owen and Hartvigsen, Thomas and Tsiligkaridis, Theodoros and Zitnik, Marinka},
  journal={arXiv},
  url={},
  year={2024}
}
```

## Acknowledgement

This codebase is built on the [Time-Series-Library](https://github.com/thuml/Time-Series-Library). Thanks!

## Disclaimer

DISTRIBUTION STATEMENT: Approved for public release. Distribution is unlimited.

This material is based upon work supported by the Under Secretary of Defense for Research and Engineering under Air Force Contract No. FA8702-15-D-0001. Any opinions, findings, conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the Under Secretary of Defense for Research and Engineering.

© 2024 Massachusetts Institute of Technology.

Subject to FAR 52.227-11 Patent Rights - Ownership by the contractor (May 2014)

The software/firmware is provided to you on an As-Is basis.

Delivered to the U.S. Government with Unlimited Rights, as defined in DFARS Part 252.227-7013 or 7014 (Feb 2014). Notwithstanding any copyright notice, U.S. Government rights in this work are defined by DFARS 252.227-7013 or DFARS 252.227-7014 as detailed above. Use of this work other than as specifically authorized by the U.S. Government may violate any copyrights that exist in this work.
