This repository was archived by the owner on Jul 2, 2021. It is now read-only.

possible bug in the way that mIoU is computed #950

@seyeeet

Description


I noticed that the mIoU results do not match the mIoU that I compute manually.
Here is an example. Let's say `preds` and `labels` are two lists containing the predictions and the ground-truth data.
I can compute the confusion matrix via `chainercv.evaluations.calc_semantic_segmentation_confusion`, and I can also compute the mIoU via `chainercv.evaluations.eval_semantic_segmentation(preds, labels)`.

The mIoU based on the confusion matrix can be computed as `np.nanmean(np.diag(confusion) / (confusion.sum(axis=1) + confusion.sum(axis=0) - np.diag(confusion)))`, but this result does not match `np.nanmean(chainercv.evaluations.eval_semantic_segmentation(preds, labels)['iou'])`.
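For reference, here is a minimal, self-contained sketch of the manual computation described above, using only numpy. The `confusion_matrix` helper is a hypothetical stand-in that mirrors what `chainercv.evaluations.calc_semantic_segmentation_confusion` returns (rows indexed by ground-truth class, columns by predicted class); the small `pred`/`label` arrays are made-up example data, not from the original report.

```python
import numpy as np

def confusion_matrix(pred, label, n_class):
    # Hypothetical stand-in for calc_semantic_segmentation_confusion:
    # rows = ground-truth class, columns = predicted class.
    mask = (label >= 0) & (label < n_class)
    return np.bincount(
        n_class * label[mask].astype(int) + pred[mask],
        minlength=n_class ** 2,
    ).reshape(n_class, n_class)

# Made-up example data with 3 classes.
pred = np.array([0, 0, 1, 1, 2, 2])
label = np.array([0, 1, 1, 1, 2, 0])

confusion = confusion_matrix(pred, label, n_class=3)

# Per-class IoU = TP / (TP + FP + FN):
#   TP            = diagonal entry
#   TP + FN       = row sum (all pixels of that ground-truth class)
#   TP + FP       = column sum (all pixels predicted as that class)
iou = np.diag(confusion) / (
    confusion.sum(axis=1) + confusion.sum(axis=0) - np.diag(confusion))
miou = np.nanmean(iou)
print(miou)  # → 0.5 for this toy example
```

`np.nanmean` is used (rather than `np.mean`) so that classes absent from both `pred` and `label`, whose IoU is 0/0 = NaN, are excluded from the average instead of dragging it down.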
