Evaluation metrics for a pipeline are saved in meta.json under the performance key. Evaluation (on the dev set) is performed automatically during training, but you can also run it manually (on any data, such as test data) with spacy evaluate. The metrics on the model pages are for the dev set.
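
As a minimal sketch of reading those scores back out, assuming a trained pipeline directory at `training/model-best` (a hypothetical path; adjust to wherever your pipeline was saved):

```python
import json
from pathlib import Path

# Path to a trained pipeline directory (hypothetical; adjust to your setup).
model_dir = Path("training/model-best")

# meta.json is written alongside the trained pipeline; the dev-set scores
# recorded during training live under the "performance" key.
with (model_dir / "meta.json").open(encoding="utf8") as f:
    meta = json.load(f)

for metric, score in meta["performance"].items():
    print(f"{metric}: {score}")
```

To score the same pipeline on held-out test data instead, something like `python -m spacy evaluate ./training/model-best ./corpus/test.spacy --output metrics.json` runs the evaluation from the command line and writes the results to a JSON file (the `.spacy` test file path here is an assumption; the data must be in spaCy's binary DocBin format).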
