Hi, I just wanted to mention that in the lecture video in Section 2 - Lecture 32, timestamp 3:28, the following is said:
Okay, so maybe not stack or V stack.
We can just define what dimension we'd like to combine them on.
I wonder if there is a torch v stack towards V stack.
Oh, there is.
And is there a torch stack for horizontal stack?
Here is a hstack.
Beautiful.
So we'll focus on just the plain stack.
If you want to have a look at vstack, it'll be quite similar to what we're going to do with stack.
And same with hstack.
This made me think that setting dim=0 or dim=1 in torch.stack would be equivalent to torch.vstack or torch.hstack.
After running a few tests I saw that this is not the case: torch.stack creates a new dimension, while torch.vstack and torch.hstack don't. Here are a few examples:
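A minimal sketch of the kind of comparison I mean, using two 2x2 tensors a and b:

```python
import torch

# Two 2x2 tensors to combine.
a = torch.tensor([[1, 2], [3, 4]])
b = torch.tensor([[5, 6], [7, 8]])

print(torch.stack([a, b], dim=0).shape)  # torch.Size([2, 2, 2]) -> new dimension inserted at dim 0
print(torch.stack([a, b], dim=1).shape)  # torch.Size([2, 2, 2]) -> new dimension inserted at dim 1
print(torch.vstack([a, b]).shape)        # torch.Size([4, 2])    -> joined along existing dim 0
print(torch.hstack([a, b]).shape)        # torch.Size([2, 4])    -> joined along existing dim 1
```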
From the results it seems that stack creates a new dimension, resulting in a 3D tensor, while vstack and hstack preserve the two existing dimensions but grow the 2D tensor shape from (2, 2) to either (4, 2) or (2, 4).
The equivalent of vstack and hstack would be torch.cat(tensors, dim=0) or torch.cat(tensors, dim=1) respectively (a quick check of that is sketched below). Maybe a quick note could be added to the lecture.
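A quick check of that equivalence, assuming the same 2x2 tensors a and b as above:

```python
import torch

a = torch.tensor([[1, 2], [3, 4]])
b = torch.tensor([[5, 6], [7, 8]])

# For 2D tensors, cat along dim 0/1 matches vstack/hstack exactly.
print(torch.equal(torch.cat([a, b], dim=0), torch.vstack([a, b])))  # True
print(torch.equal(torch.cat([a, b], dim=1), torch.hstack([a, b])))  # True
```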
Thanks for the great content!