Hello,
If I see it correctly, the machine you're currently using has only 2GB of memory available for training. That is not enough for training with transformer embeddings. You might get it working by using very small batch sizes and a small transformer model, but I wouldn't recommend it.
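
If you do want to try it anyway, the main knobs are the batcher size, gradient accumulation, and the transformer model itself. Here's a minimal sketch of the relevant `config.cfg` sections, assuming `spacy-transformers` is installed; the model name and sizes below are illustrative, not tested recommendations:

```ini
# Use a smaller pretrained transformer (distilbert-base-uncased is
# roughly half the size of bert-base; any small HF model name works).
[components.transformer.model]
@architectures = "spacy-transformers.TransformerModel.v3"
name = "distilbert-base-uncased"

# Cap how many items go through the transformer in one forward pass.
[components.transformer]
factory = "transformer"
max_batch_items = 1024

# Keep training batches small; accumulating gradients over several
# batches recovers a larger effective batch size without the memory cost.
[training]
accumulate_gradient = 3

[training.batcher]
@batchers = "spacy.batch_by_words.v1"
size = 500
tolerance = 0.2
discard_oversize = true
```

Even with settings like these, 2GB leaves very little headroom, so expect out-of-memory errors on longer documents.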
