
Commit 5068bb5 -- Added #4 (1 parent: 2c0a8cb)

File tree

1 file changed: +26 −0 lines changed


README.md

Lines changed: 26 additions & 0 deletions
@@ -168,6 +168,16 @@ If you want to contribute to this list (please do), send me a pull request or co

### Pretrained Language Models

* BERT (encoder of the transformer)
  * [TensorFlow-based](https://github.com/google-research/bert) implementation:
    * BERT<sub>base</sub>, BERT<sub>large</sub>, BERT<sub>multilingual</sub>, etc.
  * [Torch-based (Hugging Face)](https://huggingface.co/models) model implementations:
    * XLNet, XLM-RoBERTa, etc.
* GPT (decoder of the transformer)
  * [GPT-2](https://huggingface.co/gpt2)
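The checkpoints listed above are all available on the Hugging Face Hub. A minimal sketch of loading one of them, assuming the `transformers` package is installed (the checkpoint identifiers are the standard Hub names; nothing is downloaded unless the file is run as a script):

```python
# Hub checkpoint names for the BERT variants listed above.
BERT_CHECKPOINTS = {
    "base": "bert-base-uncased",
    "large": "bert-large-uncased",
    "multilingual": "bert-base-multilingual-cased",
}

if __name__ == "__main__":
    from transformers import AutoModel, AutoTokenizer  # pip install transformers

    name = BERT_CHECKPOINTS["base"]
    tokenizer = AutoTokenizer.from_pretrained(name)  # fetches vocab on first use
    model = AutoModel.from_pretrained(name)          # fetches weights on first use
    inputs = tokenizer("This movie was great!", return_tensors="pt")
    outputs = model(**inputs)
    print(outputs.last_hidden_state.shape)           # (1, seq_len, hidden_size)
```

The same `AutoModel`/`AutoTokenizer` pattern works for the Torch-based XLNet, XLM-RoBERTa, and GPT-2 checkpoints as well.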

### International Workshops

* SemEval challenges - International Workshop on Semantic Evaluation
@@ -185,6 +195,18 @@ If you want to contribute to this list (please do), send me a pull request or co

### Language Models

* [XLNet: Generalized Autoregressive Pretraining for Language Understanding](https://arxiv.org/pdf/1906.08237.pdf) -- a generalized autoregressive pretraining method that (1) enables learning bidirectional contexts by maximizing the expected likelihood over all permutations of the factorization order and (2) overcomes the limitations of BERT thanks to its autoregressive formulation.
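XLNet's permutation idea can be illustrated with a toy sketch (an assumption-level illustration of the factorization orders, not the real model): under each permutation of the positions, a token is predicted from the tokens that precede it in that order, so across all orders every position eventually conditions on context from both sides.

```python
from itertools import permutations

def contexts_per_order(n):
    """For each factorization order, map target position -> visible positions."""
    orders = {}
    for z in permutations(range(n)):
        # position z[t] is predicted from the positions earlier in the order z
        orders[z] = {z[t]: set(z[:t]) for t in range(n)}
    return orders

def union_of_contexts(n, pos):
    """All positions `pos` can condition on across every factorization order."""
    seen = set()
    for ctx in contexts_per_order(n).values():
        seen |= ctx[pos]
    return seen
```

For a 3-token sequence, `union_of_contexts(3, 1)` returns `{0, 2}`: across all six orders, the middle position sees both its left and right neighbors, which is the sense in which the objective is bidirectional while each individual factorization stays autoregressive.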
* [How to Fine-Tune BERT for Text Classification?](https://arxiv.org/pdf/1905.05583.pdf) -- the authors conduct exhaustive experiments on different fine-tuning methods of [BERT](https://arxiv.org/pdf/1810.04805.pdf) (Bidirectional Encoder Representations from Transformers) for the text classification task and provide a general solution for BERT fine-tuning.

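One of the fine-tuning strategies studied in that paper is layer-wise learning-rate decay: lower transformer layers get smaller learning rates than the top layers. A minimal sketch of the schedule (the layer count and rates here are illustrative, not the paper's exact hyperparameters):

```python
def layerwise_lrs(base_lr, n_layers, decay=0.95):
    """Per-layer learning rates, top layer first (the top layer gets base_lr)."""
    return [base_lr * decay ** depth for depth in range(n_layers)]
```

With a typical 12-layer encoder, `layerwise_lrs(2e-5, 12)` assigns the full rate to the last layer and progressively smaller rates toward the embeddings, which helps preserve the general-purpose features learned during pretraining.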
### Neural Network based Models
* [Convolutional Neural Networks for Sentence Classification](https://arxiv.org/abs/1408.5882) - convolutional neural networks (CNN) trained on top of pre-trained word vectors for sentence-level classification tasks.
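The architecture from that paper can be sketched in a few lines of NumPy: a 1-D convolution over word vectors, max-over-time pooling, and a logistic output. Random vectors stand in for real pretrained embeddings here, and the untrained weights are for shape illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
embed_dim, n_filters, width = 50, 8, 3  # illustrative sizes, not the paper's

# Random stand-ins for pretrained word vectors and learned parameters.
vocab = {w: rng.normal(size=embed_dim) for w in "this movie was great !".split()}
filters = rng.normal(size=(n_filters, width, embed_dim))
w_out, b_out = rng.normal(size=n_filters), 0.0

def predict(tokens):
    """Probability of the positive class for one tokenized sentence."""
    x = np.stack([vocab[t] for t in tokens])              # (seq_len, embed_dim)
    windows = np.stack([x[i:i + width] for i in range(len(tokens) - width + 1)])
    conv = np.einsum("nwd,fwd->nf", windows, filters)     # (n_windows, n_filters)
    pooled = np.maximum(conv, 0).max(axis=0)              # max-over-time pooling
    return 1 / (1 + np.exp(-(pooled @ w_out + b_out)))    # sigmoid output
```

In the real model the filter weights (and optionally the embeddings) are trained with cross-entropy loss; the point of the sketch is the convolution-then-pool shape of the network.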
@@ -195,8 +217,12 @@ If you want to contribute to this list (please do), send me a pull request or co
* [Simpler is better? Lexicon-based ensemble sentiment classification beats supervised methods](https://www.cs.rpi.edu/~szymansk/papers/C3-ASONAM14.pdf) - a lexicon-based ensemble can beat supervised learning.
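A toy sketch of a lexicon-based ensemble in the spirit of that paper: several sentiment lexicons each vote on a text, and the majority wins. The tiny lexicons below are made up for illustration; a real system would plug in published lexicons.

```python
# Hypothetical word-polarity lexicons (+1 positive, -1 negative).
LEXICONS = [
    {"great": 1, "good": 1, "bad": -1, "awful": -1},
    {"great": 1, "love": 1, "boring": -1, "bad": -1},
    {"good": 1, "enjoyable": 1, "awful": -1, "dull": -1},
]

def lexicon_score(tokens, lexicon):
    """Sign of the summed word polarities under one lexicon: -1, 0, or +1."""
    total = sum(lexicon.get(t, 0) for t in tokens)
    return (total > 0) - (total < 0)

def ensemble_vote(tokens):
    """Majority vote over all lexicons: +1 positive, -1 negative, 0 tie/unknown."""
    votes = sum(lexicon_score(tokens, lex) for lex in LEXICONS)
    return (votes > 0) - (votes < 0)
```

For example, `ensemble_vote("a great good movie".split())` returns `+1`, while a text with no lexicon words yields `0`, which is where ensembles of complementary lexicons help coverage.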
[Back to Top](#table-of-contents)
## Tutorials
* [GPT2 For Text Classification using Hugging Face Transformers](https://gmihaila.github.io/tutorial_notebooks/gpt2_finetune_classification/) - applying a GPT-2 model to a sentiment analysis task.
* [SAS2015](https://github.com/laugustyniak/sas2015) iPython Notebook brief introduction to Sentiment Analysis in Python @ Sentiment Analysis Symposium 2015. Scikit-learn + BoW + SemEval Data.
* [LingPipe Sentiment](http://alias-i.com/lingpipe/demos/tutorial/sentiment/read-me.html) - this tutorial covers assigning sentiment to movie reviews using language models; another common approach is sentence-based sentiment with a logistic regression classifier. For movie reviews it focuses on two classification problems: subjective (opinion) vs. objective (fact) sentences, and positive (favorable) vs. negative (unfavorable) movie reviews.
