How to use Accuracy with ignore class? #6890
-
Please see the same question on Stack Overflow: how can I use the Accuracy metric while ignoring some class? Thanks :)
-
It is currently not supported in the accuracy metric, but we have an open PR implementing that exact feature: Lightning-AI/torchmetrics#155. Currently, what you can do instead is calculate the confusion matrix and then ignore some classes based on that (remember that the true positives/correctly classified samples are found on the diagonal of the confusion matrix):
from torchmetrics import ConfusionMatrix

ignore_index = 2  # index of the class to ignore (the last of the 3 classes)
metric = ConfusionMatrix(num_classes=3)
confmat = metric(preds, target)
confmat = confmat[:2, :2]  # remove the last row and column, which correspond to the ignored class
acc = confmat.trace() / confmat.sum()
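For reference, here is a minimal runnable sketch of the snippet above, with made-up preds and target tensors for a 3-class problem in which the last class (index 2) is ignored. It assumes the ConfusionMatrix(num_classes=...) signature used at the time of this discussion; newer torchmetrics releases also expect a task argument.
import torch
from torchmetrics import ConfusionMatrix

# made-up predictions and targets for a 3-class problem
preds = torch.tensor([0, 1, 2, 1, 0, 2])
target = torch.tensor([0, 1, 2, 0, 0, 1])

metric = ConfusionMatrix(num_classes=3)
confmat = metric(preds, target)        # 3 x 3 matrix: rows are targets, columns are predictions
confmat = confmat[:2, :2]              # drop the row and column belonging to class index 2
acc = confmat.trace() / confmat.sum()  # correctly classified samples sit on the diagonal
print(acc)                             # accuracy over the remaining classes
Note that with this slicing, samples of a kept class that are predicted as the ignored class also drop out of the denominator, so the result is accuracy restricted to the kept-class block of the confusion matrix.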