Using Udify with spacy versus spacy's built-in transformer capabilities for custom language (Akkadian) #11568
The page linked to suggests pretrained weights are provided. Another option I would recommend is contacting the authors of Udify to ask for their Akkadian training data. If they can't share the data directly, yet another option is running their model on raw text to produce "silver" training data, which you can then use to train a model from scratch with newer software.
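To make the silver-data route concrete: Udify emits CoNLL-U, which you would then convert into whatever your training pipeline expects. Below is a minimal stdlib-only sketch of reading CoNLL-U into `(form, head, deprel)` tuples; the inline sample stands in for a real silver file, and the two Akkadian-looking tokens are purely illustrative, not real model output.

```python
# Sketch: parse CoNLL-U "silver" output into (form, head, deprel) tuples.
# The SILVER string is a hypothetical stand-in for a file produced by
# running the Udify model over raw Akkadian text.

SILVER = """\
# text = sharrum dannum
1\tsharrum\tsharrum\tNOUN\t_\t_\t0\troot\t_\t_
2\tdannum\tdannum\tADJ\t_\t_\t1\tamod\t_\t_
"""

def read_conllu(text):
    """Yield one sentence at a time as a list of (form, head, deprel)."""
    sent = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            # Blank lines and comment lines separate/annotate sentences.
            if sent:
                yield sent
                sent = []
            continue
        cols = line.split("\t")
        if "-" in cols[0] or "." in cols[0]:
            # Skip multiword-token and empty-node rows.
            continue
        sent.append((cols[1], int(cols[6]), cols[7]))
    if sent:
        yield sent

sentences = list(read_conllu(SILVER))
```

In practice you could skip hand-rolled parsing entirely and feed the CoNLL-U file to `spacy convert`, which produces `.spacy` training data directly.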
Hello,
I am looking to use transfer-learning methods for a custom language not currently supported by spaCy (Akkadian). In particular, I am hoping to take a pretrained language model like BERT and adapt it as a UD dependency parser on my corpus. Is it easier to use the latest release of Udify (https://lindat.mff.cuni.cz/repository/xmlui/handle/11234/1-3042), which claims to include Akkadian among its training languages and supports tasks like UD dependency parsing but was developed for spaCy 2, or to work with spaCy 3's TransformerListener class applied to BERT or something similar? I tried to install Udify on my machine but ran into many package-compatibility problems (and I only know how to use spaCy 3, not 2).
On the other hand, the spaCy documentation mentions the need to install CUDA to use GPU training, and I have heard of issues installing and using CUDA on Macs (M1 or not).
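For context on the spaCy 3 route: sharing a transformer between components is done in the training config, where the parser listens to a shared `transformer` component. A rough sketch of the relevant sections is below; `bert-base-multilingual-cased` is just an example checkpoint (it has no Akkadian in its pretraining data), and `spacy init config --lang xx --pipeline parser` generates a complete file you can adapt.

```ini
[components.transformer]
factory = "transformer"

[components.transformer.model]
@architectures = "spacy-transformers.TransformerModel.v3"
name = "bert-base-multilingual-cased"

[components.parser]
factory = "parser"

# The parser's embedding layer listens to the shared transformer
# instead of running its own tok2vec.
[components.parser.model.tok2vec]
@architectures = "spacy-transformers.TransformerListener.v1"
grad_factor = 1.0

[components.parser.model.tok2vec.pooling]
@layers = "reduce_mean.v1"
```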