
Switch to AutoModel API #174

@JohnGiorgi

Description

The pytorch-transformers library recently added an AutoModel API, which lets you instantiate any of the many pre-trained transformers it supports (BERT, GPT-2, RoBERTa, etc.) from a single set of classes.

We should switch from our BERT-specific code to the AutoModel API. Specifically, this means swapping every BertModel, BertConfig, and BertTokenizer for AutoModel, AutoConfig, and AutoTokenizer. In the long run, this will let us seamlessly load pre-trained weights from any of the popular transformer language models.
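The swap is mechanical. A minimal sketch, assuming the current Hugging Face transformers package (the library was named pytorch-transformers when this issue was filed) and using bert-base-uncased purely as an example checkpoint:

```python
from transformers import AutoConfig, AutoModel, AutoTokenizer

# Any checkpoint name on the model hub works here; swapping this string
# is all it takes to load a different architecture.
model_name = "bert-base-uncased"

# The Auto* classes resolve the right concrete class (BertConfig,
# BertTokenizer, BertModel, ...) from the checkpoint name, so no
# BERT-specific imports are needed.
config = AutoConfig.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name, config=config)
```

Changing model_name to, say, a RoBERTa or GPT-2 checkpoint would then require no code changes on the model side, only in the preprocessing.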

This should be addressed in two pull requests:

  1. Simply use AutoModel, AutoTokenizer, and AutoConfig in place of the BERT-specific classes. This won't be enough to use arbitrary transformer weights, because our preprocessing steps are still BERT-specific, but it will let us use any model whose preprocessing matches BERT's (like RoBERTa).
  2. Rewrite our preprocessing steps to make them model-agnostic.
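For step 2, the key idea is to stop hard-coding BERT's [CLS]/[SEP] wrapping and instead delegate special-token handling to the tokenizer itself. A sketch of this pattern, using a hypothetical toy tokenizer in place of a real one (current transformers tokenizers expose the same three methods):

```python
class ToyTokenizer:
    """Hypothetical stand-in mimicking the tokenizer interface we'd rely on."""

    def tokenize(self, text):
        # Real tokenizers apply WordPiece/BPE; whitespace split suffices here.
        return text.split()

    def convert_tokens_to_ids(self, tokens):
        # Real tokenizers look tokens up in a vocab; token length stands in.
        return [len(t) for t in tokens]

    def build_inputs_with_special_tokens(self, ids):
        # Each tokenizer knows its model's own sequence format
        # (e.g. [CLS]...[SEP] for BERT, <s>...</s> for RoBERTa).
        return [101] + ids + [102]


def build_inputs(tokenizer, text):
    """Model-agnostic preprocessing: no BERT-specific tokens appear here."""
    tokens = tokenizer.tokenize(text)
    ids = tokenizer.convert_tokens_to_ids(tokens)
    return tokenizer.build_inputs_with_special_tokens(ids)


print(build_inputs(ToyTokenizer(), "hello world"))  # [101, 5, 5, 102]
```

Because build_inputs never mentions [CLS] or [SEP], the same code works unchanged for any tokenizer that implements this interface.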
