Replies: 2 comments 1 reply
-
I want to add that I found out I could just use the Evaluator for this purpose. It works for AUROC, but it raises an error for the F1Score metric:
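Roughly this setup (a simplified sketch; the exact `fields` and `prefix` arguments are assumptions based on the error message and may differ between anomalib versions):

```python
from anomalib.metrics import AUROC, Evaluator, F1Score

# AUROC only needs the raw anomaly scores (pred_score), while F1Score
# needs binarized predictions (pred_label) produced by thresholding.
evaluator = Evaluator(
    val_metrics=[
        AUROC(fields=["pred_score", "gt_label"], prefix="image_"),
        F1Score(fields=["pred_label", "gt_label"], prefix="image_"),
    ],
)
```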
This leads to the aforementioned error: ValueError: Cannot update metric of type <class 'anomalib.metrics.f1_score.F1Score'>. Passed dataclass instance does not have a value for field with name pred_label. I don't really understand where this problem with pred_label is coming from.
-
Hello, I believe you need to use
-
Hello.
For more efficient training of various models, I wanted to implement a custom EarlyStopping callback so that training stops once two thresholds (I-AUROC and I-F1Score) are surpassed during validation. Therefore I used:
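A simplified sketch of that callback (the monitored keys `image_AUROC` and `image_F1Score` are assumptions about how the validation metrics would be logged):

```python
from lightning.pytorch.callbacks import Callback


class ThresholdEarlyStopping(Callback):
    """Stop training once both validation metrics exceed their thresholds."""

    def __init__(self, auroc_key="image_AUROC", f1_key="image_F1Score",
                 auroc_threshold=0.95, f1_threshold=0.95):
        super().__init__()
        self.auroc_key = auroc_key
        self.f1_key = f1_key
        self.auroc_threshold = auroc_threshold
        self.f1_threshold = f1_threshold

    def on_validation_end(self, trainer, pl_module):
        # Logged metrics are collected in trainer.callback_metrics.
        metrics = trainer.callback_metrics
        auroc = metrics.get(self.auroc_key)
        f1 = metrics.get(self.f1_key)
        # Only stop when both metrics are available and above their thresholds.
        if auroc is not None and f1 is not None:
            if auroc >= self.auroc_threshold and f1 >= self.f1_threshold:
                trainer.should_stop = True
```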
However, I realized that during validation these metrics are not calculated (or at least not logged, I guess), so this callback cannot work.
How do I define what to log to use this type of callback? Or am I on the wrong path entirely?