How to Train SuperSimpleNet in Unsupervised way with a Custom Dataset? #2724
-
Hi,

I'm trying to train SuperSimpleNet in an unsupervised way on a custom dataset. My datamodule is set up like this:

```python
from anomalib.data import Folder

# Custom folder datamodule with normal ("OK") and anomalous ("NG") images.
dataset = Folder(
    name="dataset",
    root="./dataset",
    normal_dir="OK",
    abnormal_dir="NG",
    train_batch_size=16,
    eval_batch_size=16,
)
dataset.setup()
```

However, I noticed that the model requires a gt_mask to be present. In supersimplenet/torch_model.py there is

```python
masks = self.downsample_mask(masks, *features.shape[-2:])
```

but I don't have any masks, and the downsample_mask method requires them. My training code is below:

```python
from anomalib.engine import Engine
from anomalib.models import Supersimplenet

epoch = 100  # number of training epochs (set as needed)

model = Supersimplenet()
engine = Engine(max_epochs=epoch)
engine.fit(model, datamodule=dataset)
```

Is it mandatory to have a gt_mask even for unsupervised training? If not, how can I properly configure the model to train with only normal data? Thanks in advance!
Replies: 1 comment
-
Hi, in the unsupervised case the model still expects ground-truth masks, but they should all be empty (i.e., all zeros). A similar problem already came up in #2552, where I also suggested a fix, but I'm not sure the fix should be merged, since it could lead to unexpected behavior when anomalous images carry only image-level labels. For the time being, though, it could perhaps be added together with an explicit check that all labels are 0.
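For illustration, here is a minimal sketch of that idea as a standalone helper (hypothetical, not part of anomalib's API; the (B, C, H, W) image shape and the 0-means-normal label convention are assumptions):

```python
import torch


def make_empty_masks(images: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    """Build all-zero ground-truth masks for a normal-only (unsupervised) batch.

    images: (B, C, H, W) image tensor; labels: (B,) image-level labels, 0 = normal.
    """
    # Empty masks are only correct if every image in the batch is labelled normal.
    if labels.any():
        raise ValueError("Batch contains anomalous labels; all-zero masks would be wrong.")
    b, _, h, w = images.shape
    # One single-channel mask per image, matching the input's spatial size.
    return torch.zeros((b, 1, h, w), dtype=images.dtype, device=images.device)
```

Masks produced this way would then stand in for annotated ground truth during training on normal-only data, while the label check guards against silently treating anomalous images as normal.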