Why are UNet up blocks channels "flipped"? #7671
Unanswered · AlejandroBaron asked this question in Q&A
Replies: 1 comment · 1 reply
I think you understood correctly. What is your question exactly? Could you elaborate?
Original question (AlejandroBaron):
Hello, I'm checking the UNet implementation, and when the up blocks are instantiated I see that the input channels are "flipped" relative to my understanding of UNets on the "up" side (second half):
diffusers/src/diffusers/models/unets/unet_2d.py
Lines 209 to 214 in 08bf754
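Roughly, the channel bookkeeping at those lines looks like this. This is a simplified paraphrase rather than a verbatim copy, and the `(32, 64, 128)` / `UpBlock2D` values are just an illustrative configuration:

```python
# Rough paraphrase of the up-block channel bookkeeping in UNet2DModel.__init__
# (see the permalink above for the exact code); the values are illustrative.
block_out_channels = (32, 64, 128)
up_block_types = ("UpBlock2D", "UpBlock2D", "UpBlock2D")

reversed_block_out_channels = list(reversed(block_out_channels))
output_channel = reversed_block_out_channels[0]
for i, up_block_type in enumerate(up_block_types):
    prev_output_channel = output_channel
    output_channel = reversed_block_out_channels[i]
    input_channel = reversed_block_out_channels[min(i + 1, len(block_out_channels) - 1)]
    print(f"{up_block_type}: in={input_channel}, out={output_channel}, prev={prev_output_channel}")
```

So `input_channel` ends up smaller than `output_channel` for every up block, which is the ordering I'm asking about.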
So, if `block_out_channels` is, let's say, `(32, 64, 128)`, then the pairs of `input_channel` and `output_channel` would be:

| input_channel | output_channel | prev_channels |
| --- | --- | --- |
| 64 | 128 | 128 |
| 32 | 64 | 128 |
| 32 | 32 | 64 |

Shouldn't the up half of the UNet be bottlenecking the channels (i.e. `input_channel > output_channel`)? I get that when each layer is instantiated its input channels are increased by the skip connection / `prev_channels`, but this still looks odd to me.
EDIT: To support my theory, I believe the ordering I describe (decreasing channels on the up path) is how it's done in the StabilityAI repo:
https://github.com/Stability-AI/stablediffusion/blob/main/ldm/modules/diffusionmodules/openaimodel.py
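For comparison, the decoder-side (`output_blocks`) channel bookkeeping in that file looks roughly like the following. Again a simplified paraphrase (attention and upsample layers omitted), with schematic values for `model_channels`, `channel_mult`, `num_res_blocks`, and the skip-connection channel list:

```python
# Rough paraphrase of the decoder channel bookkeeping in UNetModel.__init__
# from ldm/modules/diffusionmodules/openaimodel.py; the values are illustrative.
model_channels = 32
channel_mult = (1, 2, 4)
num_res_blocks = 2

# Channel counts pushed by the encoder ("input") blocks: the stem conv, then
# num_res_blocks ResBlocks per level, plus a downsample between levels.
input_block_chans = [model_channels * m for m in (1, 1, 1, 1, 2, 2, 2, 4, 4)]

ch = model_channels * channel_mult[-1]  # channels coming out of the middle block
for level, mult in list(enumerate(channel_mult))[::-1]:
    for i in range(num_res_blocks + 1):
        ich = input_block_chans.pop()         # skip-connection channels
        in_channels = ch + ich                # ResBlock input: current + skip
        out_channels = model_channels * mult  # output shrinks as we go up
        print(f"level={level}: ResBlock({in_channels} -> {out_channels})")
        ch = out_channels
```

Here the ResBlock output channels only ever shrink on the way up (128 → 64 → 32 in this example), i.e. `input_channels > output_channels`, which is what I expected from diffusers as well.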