1 parent 5049d9d commit 7df2aae
README.md
@@ -4,6 +4,16 @@ Our formulation naturally relaxes to the well-studied weighted maximum coverage

### Beta: Huggingface AutoTokenizer interface

+Install the beta version (for transformers >= 4):
+```
+wget "https://github.com/PreferredAI/pcatt/archive/refs/tags/v0.14-pre2.zip"
+unzip v0.14-pre2.zip -d pcatt
+cd pcatt
+pip install -r requirements.txt
+pip install transformers
+pip install .
+```
+
For "training" either:
```
from pcatt.hf.greedtok import GreedTok
@@ -91,4 +101,4 @@ Evaluations in [eval_notebook.ipynb](https://github.com/PreferredAI/aoatt/blob/m
journal={arXiv preprint arXiv:2501.06246},
url={https://arxiv.org/abs/2501.06246},
}
-```
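The first hunk's context line notes that the formulation "naturally relaxes to the well-studied weighted maximum coverage" problem. As background only (this is not code from pcatt; the function name and toy data are illustrative), the classic greedy (1 - 1/e)-approximation for weighted maximum coverage can be sketched as:

```python
# Background sketch, not the pcatt implementation: greedy weighted
# maximum coverage. Given candidate sets over weighted elements, pick
# up to k sets maximizing the total weight of the covered union.

def greedy_weighted_max_coverage(sets, weights, k):
    """sets    -- dict: set name -> iterable of elements
    weights -- dict: element -> non-negative weight
    k       -- maximum number of sets to choose
    """
    covered = set()
    chosen = []
    remaining = dict(sets)
    for _ in range(min(k, len(sets))):
        # Marginal gain of a set: total weight of elements it would newly cover.
        name, gain = max(
            ((n, sum(weights[e] for e in elems if e not in covered))
             for n, elems in remaining.items()),
            key=lambda t: t[1],
        )
        if gain == 0:
            break  # nothing left to gain
        chosen.append(name)
        covered |= set(remaining.pop(name))
    return chosen, sum(weights[e] for e in covered)

# Toy example: hypothetical token "sets" with frequency-like weights.
sets = {"th": {1, 2}, "he": {2, 3}, "the": {1, 2, 3}, "x": {4}}
weights = {1: 5, 2: 1, 3: 5, 4: 2}
chosen, total = greedy_weighted_max_coverage(sets, weights, 2)
print(chosen, total)  # → ['the', 'x'] 13
```

The greedy choice picks "the" first (marginal gain 11), after which "th" and "he" cover nothing new, so "x" is selected next; this mirrors why greedy selection is a natural baseline for coverage-style tokenizer objectives.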