So the answer is that spaCy doesn't support pretraining/fine-tuning transformers right now, isn't it?

"The impact of spacy pretrain varies, but it will usually be worth trying if you’re not using a transformer model"

Darn. I suppose I'll re-pretrain my CNN tok2vec component instead, and start a different thread (with an appropriate title) for the errors I hit while doing that...

If anyone has thoughts on why this isn't supported, I'd be interested to hear them. My domain is highly specific and full of jargon, which I think makes it worth fine-tuning a language model even when that model is a transformer, but I'm open to reasons I'm wrong about that.
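For reference, here's roughly what I mean by re-pretraining the CNN tok2vec. This is a sketch based on my reading of the `spacy pretrain` docs, not a known-good config; the values (epochs, dropout, hidden size) are illustrative, and `raw.jsonl` is a placeholder for your own raw-text corpus. The safest route is to let spaCy generate the block itself with `python -m spacy init fill-config base.cfg config.cfg --pretraining`.

```ini
[paths]
raw_text = null

# Raw-text corpus for the pretraining objective (one JSON object per line).
[corpora.pretrain]
@readers = "spacy.JsonlCorpus.v1"
path = ${paths.raw_text}

# Pretrains the tok2vec layer with a character-prediction objective.
[pretraining]
max_epochs = 1000
dropout = 0.2
component = "tok2vec"
layer = ""

[pretraining.objective]
@architectures = "spacy.PretrainCharacters.v1"
maxout_pieces = 3
hidden_size = 300
n_characters = 4
```

Then (if I understand the CLI correctly) you'd run something like `python -m spacy pretrain config.cfg ./pretrain_output --paths.raw_text raw.jsonl` and point `[initialize.init_tok2vec]` at one of the saved weight files when training.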
