Is it possible to run the MONAI bundle model inference on CPU? #5955
-
I am trying to run the inference of https://github.com/Project-MONAI/model-zoo/tree/dev/models/brats_mri_segmentation. I only have a 12 GB GPU, and inference results in an out-of-memory error. Is it possible to run the inference on CPU?
-
Hi @Bala93, could you please change this line:
https://github.com/Project-MONAI/model-zoo/blob/dev/models/brats_mri_segmentation/configs/inference.json#L12
and set the device to cpu? In addition, I think you can also try to reduce the roi_size to smaller values, such as (120, 120, 80).
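For reference, a minimal sketch of applying both overrides programmatically with monai.bundle.ConfigParser instead of editing the JSON by hand. The config ids used here (device, inferer#roi_size, evaluator) are assumptions based on typical MONAI segmentation bundles; verify them against the bundle's actual inference.json before running:

```python
# Sketch: override the bundle config for CPU inference with a smaller patch.
# "device", "inferer#roi_size", and "evaluator" are assumed config ids;
# check configs/inference.json for the real ones.
from monai.bundle import ConfigParser

parser = ConfigParser()
parser.read_config("configs/inference.json")

# Force CPU inference ($-prefixed strings are evaluated as Python expressions).
parser["device"] = "$torch.device('cpu')"

# Shrink the sliding-window patch to lower peak memory use.
parser["inferer#roi_size"] = [120, 120, 80]

# Build and run the evaluator defined by the (now overridden) config.
evaluator = parser.get_parsed_content("evaluator")
evaluator.run()
```

Since the config is parsed lazily, setting the items before get_parsed_content is enough for the overrides to propagate to every component that references them.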
-
Hi @yiheng-wang-nv, thanks for your answer. FYI, I am trying to run the BraTS segmentation through Slicer, like this example: https://github.com/Project-MONAI/model-zoo/tree/dev/models/pancreas_ct_dints_segmentation. I tried changing the roi_size as you suggested, but I was using Task01_BrainTumour (Medical Segmentation Decathlon), so there was a dimension mismatch. Now I have downloaded BraTS 2019, and each patient has 4 different modalities. May I know what the input should be? Is it 4 separate files, or a single file with a stacked volume? I would really appreciate your help.
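As a sketch of one possibility: MONAI's LoadImaged can stack a list of file paths into a single multi-channel image, so the four modalities can be fed as a 4-channel volume. The file names and modality order below are hypothetical; the expected channel order should be confirmed in the bundle's configs/metadata.json:

```python
# Sketch, not the bundle's own pipeline: LoadImaged stacks a list of files
# into one multi-channel image. File names and modality order are
# hypothetical; confirm the expected order in configs/metadata.json.
from monai.transforms import Compose, LoadImaged, EnsureTyped

files = {
    "image": [
        "BraTS19_xxx_t1.nii.gz",
        "BraTS19_xxx_t1ce.nii.gz",
        "BraTS19_xxx_t2.nii.gz",
        "BraTS19_xxx_flair.nii.gz",
    ]
}

load = Compose([
    LoadImaged(keys="image"),  # stacks the 4 files, adding a new first dim
    EnsureTyped(keys="image"),
])

sample = load(files)
print(sample["image"].shape)  # expect a 4-channel-first volume, e.g. (4, H, W, D)
```

Whether the bundle's own preprocessing expects a list of paths or a pre-stacked 4-channel file is defined by its inference.json and metadata.json, so those files are the authoritative reference.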