Replies: 2 comments 1 reply
-
Hi @tangy5, could you please take a look at this problem? Thanks in advance.
-
@aalhayali Hi, thanks for the question. Great to see UNETR being used on broader datasets.
-
Hi everyone!
I have a quick question about an error I'm running into, and I'm not sure how to resolve it.
I am working on sagittal MRI images of size (512, 512, 7) (H, W, Slices). I applied the Resized transform, along with other transformations, to resize the images to (7, 256, 256); a sketch of this step is shown below. The resulting shapes of the original and transformed batches are (2, 1, 7, 512, 512) and (2, 1, 7, 256, 256), respectively. When building the UNETR model, I passed (7, 256, 256) as the img_size parameter, and I'm getting the following error:

ValueError: patch_size should be smaller than img_size.

I am unsure why this error is raised, since I am passing exactly the image size produced by the Resized transform, and the documentation states: img_size (Union[Sequence[int], int]) – dimension of input image. I would appreciate any suggestions or pointers to fix this error!
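For reference, the resizing step looks roughly like this (a simplified sketch of my pipeline; the "image" key and the EnsureChannelFirstd step are placeholders, and my other transformations are omitted):

```python
# Simplified sketch of the resizing step ("image" is a placeholder for my
# actual dictionary key; other transforms in the pipeline are omitted).
from monai.transforms import Compose, EnsureChannelFirstd, Resized

transforms = Compose([
    EnsureChannelFirstd(keys=["image"]),
    # Spatial shape (7, 512, 512) -> (7, 256, 256): keep the 7 slices,
    # halve the in-plane resolution.
    Resized(keys=["image"], spatial_size=(7, 256, 256)),
])
```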
model code:
```python
model = UNETR(
    in_channels=1,
    out_channels=2,
    img_size=(7, 256, 256),
    feature_size=16,
    norm_name='batch',
    spatial_dims=3,
).to(device)
```
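One thing I noticed while reading the MONAI source (so this is my reading, not confirmed): UNETR appears to build its ViT with a fixed patch size of 16 along every spatial dimension, which would mean every entry of img_size must be at least 16, and my slice dimension of 7 is smaller than that. A minimal sketch of that check:

```python
# Sketch of the check I believe raises the error (based on my reading of the
# MONAI source; the fixed patch size of 16 is an assumption, not confirmed).
img_size = (7, 256, 256)
patch_size = (16, 16, 16)  # UNETR's ViT patch size, assumed fixed at 16 per dim

for m, p in zip(img_size, patch_size):
    if m < p:
        # The condition that (I think) triggers:
        # ValueError: patch_size should be smaller than img_size.
        print(f"img_size entry {m} < patch size {p}")
```

If that reading is correct, resizing the slice dimension up to at least 16 (e.g. spatial_size=(16, 256, 256)) should get past this check, though I'm not sure that's the intended fix for anisotropic data like mine.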