* **A PyTorch NLP framework.** Our framework builds directly on [PyTorch](https://pytorch.org/), making it easy to
train your own models and experiment with new approaches using Flair embeddings and classes.

Now at [version 0.10](https://github.com/flairNLP/flair/releases)!

## Join Us: Open Positions at HU-Berlin!
If you use the Flair framework for your experiments, please cite [this paper](https://www.aclweb.org/anthology/C18-1139/):

```
@inproceedings{akbik2018coling,
  title     = {Contextual String Embeddings for Sequence Labeling},
  author    = {Akbik, Alan and Blythe, Duncan and Vollgraf, Roland},
  booktitle = {{COLING} 2018, 27th International Conference on Computational Linguistics},
  pages     = {1638--1649},
  year      = {2018}
}
```
If you use our new "FLERT" models or approach, please cite [this paper](https://arxiv.org/abs/2011.06993):

```
@misc{schweter2020flert,
  title         = {FLERT: Document-Level Features for Named Entity Recognition},
  author        = {Stefan Schweter and Alan Akbik},
  year          = {2020},
  eprint        = {2011.06993},
  archivePrefix = {arXiv},
  primaryClass  = {cs.CL}
}
```

If you use our TARS approach for few-shot and zero-shot learning, please cite [this paper](https://kishaloyhalder.github.io/pdfs/tars_coling2020.pdf):

```
@inproceedings{halder2020coling,
  title     = {Task Aware Representation of Sentences for Generic Text Classification},
  author    = {Halder, Kishaloy and Akbik, Alan and Krapac, Josip and Vollgraf, Roland},
  booktitle = {{COLING} 2020, 28th International Conference on Computational Linguistics},
  year      = {2020}
}
```

## Contact
Please email your questions or comments to [Alan Akbik](http://alanakbik.github.io/).

## Contributing

Thanks for your interest in contributing! There are many ways to get involved;
start with our [contributor guidelines](CONTRIBUTING.md) and then
check these [open issues](https://github.com/flairNLP/flair/issues) for specific tasks.
    mini_batch_chunk_size=1,  # remove this parameter to speed up computation if you have a big GPU
)
```
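The `mini_batch_chunk_size` option trades speed for memory: each mini-batch is split into smaller chunks that are forwarded and backpropagated one at a time while gradients accumulate, before a single optimizer step. A rough sketch of that idea in plain PyTorch (the model and data below are hypothetical stand-ins, not Flair internals):

```python
import torch

model = torch.nn.Linear(8, 2)                      # stand-in model
loss_fn = torch.nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

batch_x = torch.randn(4, 8)                        # one mini-batch of 4 examples
batch_y = torch.tensor([0, 1, 0, 1])
chunk_size = 1                                     # analogous to mini_batch_chunk_size=1

optimizer.zero_grad()
for i in range(0, len(batch_x), chunk_size):
    x = batch_x[i:i + chunk_size]
    y = batch_y[i:i + chunk_size]
    # scale each chunk's loss so accumulated gradients match a full-batch update
    loss = loss_fn(model(x), y) * len(x) / len(batch_x)
    loss.backward()                                # gradients accumulate across chunks
optimizer.step()                                   # one update for the whole mini-batch
```

Only one chunk's activations are held in memory at a time, which is why a smaller chunk size fits on smaller GPUs.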

This will give you state-of-the-art numbers similar to the ones reported in the paper.

(If you don't have a big GPU to fine-tune transformers, try `DocumentPoolEmbeddings` or `DocumentRNNEmbeddings` instead; sometimes they work just as well!)

```python
from flair.data import Corpus
from flair.datasets import TREC_6
from flair.embeddings import TransformerDocumentEmbeddings

document_embeddings = TransformerDocumentEmbeddings('distilbert-base-uncased', fine_tune=True)
```