The NER log difference between using tok2vec and transformers #9293
-
I am trying to compare my NER training between the tok2vec and transformer pipelines. My data set has 6,650 training documents and 1,700 validation docs. The training log for tok2vec shows that epoch 0 covers about 6,650 steps:
But when using the transformer pipeline, epoch 0 seems to cover fewer than 2,800 steps:
Why is there a difference in epoch size?
-
The # column is the number of batches, not the number of documents. Is your batch size different between your two pipelines?
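To illustrate the point above: if the # column counts batches, the number of log steps per epoch is the document count divided by the (effective) batch size, so two pipelines with different batch sizes produce very different step counts over the same data. A minimal sketch, using the 6,650-document figure from the question and hypothetical batch sizes for illustration (spaCy's actual configs typically use compounding or padded-length batching, so the real count varies across the run):

```python
import math

def batches_per_epoch(n_docs: int, batch_size: int) -> int:
    """Number of optimizer steps (rows in the training log) in one epoch,
    assuming a fixed batch size for simplicity."""
    return math.ceil(n_docs / batch_size)

# 6,650 training docs from the question; batch sizes are illustrative.
print(batches_per_epoch(6650, 1))    # 6650 steps, one doc per batch
print(batches_per_epoch(6650, 128))  # 52 steps with a larger batch size
```

So a larger batch size in the transformer config would be enough to explain why its epoch appears "shorter" in the log, even though both pipelines see the same documents.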