
Commit 00ba5bd

version 0.3.1
1 parent f944683 commit 00ba5bd

File tree: 19 files changed (+7, −6 lines)


README.md

Lines changed: 2 additions & 2 deletions

```diff
@@ -26,7 +26,7 @@
 OpenDelta is a toolkit for parameter-efficient tuning methods (we dub it as *delta tuning*), by which users could flexibly assign (or add) a small amount parameters to update while keeping the most paramters frozen. By using OpenDelta, users could easily implement prefix-tuning, adapters, Lora, or any other types of delta tuning with preferred PTMs.
-- Our repo is tested on Python 3.=-0 and PyTorch 1.9.0. Lower version may also be supported.
+- The last version of OpenDelta is tested on Python==3.8.13, PyTorch==1.12.1, transformers==4.22.2. Other versions are likely to be supported as well. If you encounter bugs when using your own package versions, please raise an issue, we will look into it as soon as possible.
 - **A demo of using Opendelta to modify the PLM (E.g., BART).**
 ![How PLM changes using Delta-tuning](docs/source/imgs/demo.gif)
@@ -162,7 +162,7 @@ delta3.log()
 ## Verified Default Configurations
 - **You can try to use OpenDelta on *any* backbone models based on PyTorch.**
-- However, with small chances thatThe interface of the submodules of the backbone model is not supported. Therefore we verified some commonly
+- However, with small chances that the interface of the submodules of the backbone model is not supported. Therefore we verified some commonly
 used models that OpenDelta are sure to support.
 - We will keep testing more and more emerging models.
```
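The README above describes the core idea of delta tuning: keep most pretrained parameters frozen and train only a small added delta, LoRA being one such method. The following is a minimal toy sketch of the LoRA flavor of that idea in NumPy; it is hypothetical illustration only, not the OpenDelta API, and all names and sizes are made up.

```python
import numpy as np

# Toy sketch of LoRA-style delta tuning (NOT the OpenDelta API):
# the pretrained weight W stays frozen, and only the low-rank
# update B @ A is trained. Names and sizes are illustrative.
rng = np.random.default_rng(0)
d, r = 768, 8                        # hidden size, low rank (r << d)
W = rng.standard_normal((d, d))      # frozen pretrained weight
A = rng.standard_normal((r, d))      # trainable down-projection
B = np.zeros((d, r))                 # trainable up-projection, zero-init

def forward(x):
    # Effective weight is W + B @ A; with B zero-initialized the
    # delta starts as a no-op, so training begins exactly from the
    # pretrained model's behavior.
    return x @ (W + B @ A).T

x = rng.standard_normal((2, d))
assert np.allclose(forward(x), x @ W.T)   # no-op at initialization

trainable = A.size + B.size
print(f"trainable: {trainable} of {W.size + trainable} parameters "
      f"({trainable / (W.size + trainable):.2%})")
```

Only about 2% of the parameters in this toy layer are trainable, which is the point of the approach: the frozen backbone is shared, and each task stores only its small delta.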

docs/source/conf.py

Lines changed: 2 additions & 2 deletions

```diff
@@ -31,8 +31,8 @@
 # The full version, including alpha/beta/rc tags
-release = '0.3.0'
-version = "0.3.0"
+release = '0.3.1'
+version = "0.3.1"
 html_theme = 'sphinx_rtd_theme'
 html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
```

docs/source/notes/update.md

Lines changed: 2 additions & 1 deletion

```diff
@@ -2,6 +2,7 @@
 ## Version 0.3.1
 - We update [must_try.py](https://github.com/thunlp/OpenDelta/tree/main/examples/unittest/must_try.py) for a simple introduction of the core functionality of OpenDelta.
+- Thanks to [Weilin Zhao](https://github.com/Achazwl), we merge the long-developed branch parallel_adapter into the main branch.
 ## Version 0.3.0
@@ -25,4 +26,4 @@
 ## Version 0.2.4
 ### Updates
 - examples/examples_seq2seq and examples/examples_text-classification are deprecated and moved to [legacy](https://github.com/thunlp/OpenDelta/tree/main/examples/legacies)
-- we provide [examples_prompt](https://github.com/thunlp/OpenDelta/tree/main/examples/examples_prompt), as a cleaner and more general framework, which unifies the delta tuning paradigm and the prompt-tuning paradigm. It is still based on [Huggingface Trainers](https://huggingface.co/docs/transformers/main_classes/trainer). In this example framework, the running pipeline is [a unified script](https://github.com/thunlp/OpenDelta/tree/main/examples/examples_prompt/src), the differences in tasks, models, delta tuning models, and even prompt-tuning paradigms are [more modular and be more independent ](https://github.com/thunlp/OpenDelta/tree/main/examples/examples_prompt/backbones). Please try it out!
+- Thanks to [Zhen Zhang](https://github.com/namezhenzhang), we provide [examples_prompt](https://github.com/thunlp/OpenDelta/tree/main/examples/examples_prompt), a cleaner and more general framework that unifies the delta tuning paradigm and the prompt-tuning paradigm. It is still based on [Huggingface Trainers](https://huggingface.co/docs/transformers/main_classes/trainer). In this example framework, the running pipeline is [a unified script](https://github.com/thunlp/OpenDelta/tree/main/examples/examples_prompt/src), and the differences in tasks, models, delta tuning models, and even prompt-tuning paradigms are [more modular and independent](https://github.com/thunlp/OpenDelta/tree/main/examples/examples_prompt/backbones). Please try it out!
```
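The headline change recorded in this changelog is the merge of the parallel_adapter branch. As a rough, hypothetical sketch of what a parallel adapter computes (toy NumPy code, not OpenDelta's actual implementation): a small bottleneck branch runs alongside the frozen layer, and its output is added to the layer's output rather than being applied sequentially after it.

```python
import numpy as np

# Hypothetical toy sketch of a parallel adapter (not OpenDelta's
# implementation): a trainable bottleneck branch runs in parallel
# with a frozen backbone layer, and the two outputs are summed.
rng = np.random.default_rng(0)
d, bottleneck = 768, 64

W_frozen = rng.standard_normal((d, d)) * 0.02         # frozen backbone weight
W_down = rng.standard_normal((d, bottleneck)) * 0.02  # trainable down-projection
W_up = np.zeros((bottleneck, d))                      # trainable, zero-init

def layer_with_parallel_adapter(x):
    backbone = x @ W_frozen                        # frozen path
    adapter = np.maximum(x @ W_down, 0.0) @ W_up   # ReLU bottleneck path
    return backbone + adapter                      # summed, not chained

x = rng.standard_normal((2, d))
# With W_up zero-initialized, the adapter branch starts as a no-op.
assert np.allclose(layer_with_parallel_adapter(x), x @ W_frozen)
```

The contrast with a sequential adapter is only where the branch attaches: a sequential adapter transforms the layer's output, while the parallel variant reads the layer's input and adds its result to the layer's output.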

examples/examples_text-classification/configs/parallel_adapter_roberta-base/cola.json renamed to examples/legacies/examples_text-classification/configs/parallel_adapter_roberta-base/cola.json

File renamed without changes.

examples/examples_text-classification/configs/parallel_adapter_roberta-base/mnli.json renamed to examples/legacies/examples_text-classification/configs/parallel_adapter_roberta-base/mnli.json

File renamed without changes.

examples/examples_text-classification/configs/parallel_adapter_roberta-base/mrpc.json renamed to examples/legacies/examples_text-classification/configs/parallel_adapter_roberta-base/mrpc.json

File renamed without changes.

examples/examples_text-classification/configs/parallel_adapter_roberta-base/qnli.json renamed to examples/legacies/examples_text-classification/configs/parallel_adapter_roberta-base/qnli.json

File renamed without changes.

examples/examples_text-classification/configs/parallel_adapter_roberta-base/qqp.json renamed to examples/legacies/examples_text-classification/configs/parallel_adapter_roberta-base/qqp.json

File renamed without changes.

examples/examples_text-classification/configs/parallel_adapter_roberta-base/rte.json renamed to examples/legacies/examples_text-classification/configs/parallel_adapter_roberta-base/rte.json

File renamed without changes.

examples/examples_text-classification/configs/parallel_adapter_roberta-base/sst2.json renamed to examples/legacies/examples_text-classification/configs/parallel_adapter_roberta-base/sst2.json

File renamed without changes.
