torch.stack #1311
Open
Description
I tried torch.stack with dim=0 and dim=1.
torch.vstack is the same as torch.stack with dim=0.
torch.hstack just concatenates the tensors end to end (extending the columns), and is not the same as torch.stack with dim=1, as mentioned in the tutorial.
```python
x = torch.arange(1., 10.)
x, x.shape
# (tensor([1., 2., 3., 4., 5., 6., 7., 8., 9.]), torch.Size([9]))

x_stacked = torch.stack([x, x, x, x], dim=0)
x, x.shape, x_stacked, x_stacked.shape
# (tensor([7., 2., 3., 4., 5., 6., 7., 8., 9.]),
#  torch.Size([9]),
#  tensor([[7., 2., 3., 4., 5., 6., 7., 8., 9.],
#          [7., 2., 3., 4., 5., 6., 7., 8., 9.],
#          [7., 2., 3., 4., 5., 6., 7., 8., 9.],
#          [7., 2., 3., 4., 5., 6., 7., 8., 9.]]),
#  torch.Size([4, 9]))

x_stacked = torch.stack([x, x, x, x], dim=1)
x, x.shape, x_stacked, x_stacked.shape
# (tensor([7., 2., 3., 4., 5., 6., 7., 8., 9.]),
#  torch.Size([9]),
#  tensor([[7., 7., 7., 7.],
#          [2., 2., 2., 2.],
#          [3., 3., 3., 3.],
#          [4., 4., 4., 4.],
#          [5., 5., 5., 5.],
#          [6., 6., 6., 6.],
#          [7., 7., 7., 7.],
#          [8., 8., 8., 8.],
#          [9., 9., 9., 9.]]),
#  torch.Size([9, 4]))

x_hstacked = torch.hstack([x, x, x, x])
x, x.shape, x_hstacked, x_hstacked.shape
# (tensor([7., 2., 3., 4., 5., 6., 7., 8., 9.]),
#  torch.Size([9]),
#  tensor([7., 2., 3., 4., 5., 6., 7., 8., 9., 7., 2., 3., 4., 5., 6., 7., 8., 9.,
#          7., 2., 3., 4., 5., 6., 7., 8., 9., 7., 2., 3., 4., 5., 6., 7., 8., 9.]),
#  torch.Size([36]))

x_vstacked = torch.vstack([x, x, x, x])
x, x.shape, x_vstacked, x_vstacked.shape
# (tensor([7., 2., 3., 4., 5., 6., 7., 8., 9.]),
#  torch.Size([9]),
#  tensor([[7., 2., 3., 4., 5., 6., 7., 8., 9.],
#          [7., 2., 3., 4., 5., 6., 7., 8., 9.],
#          [7., 2., 3., 4., 5., 6., 7., 8., 9.],
#          [7., 2., 3., 4., 5., 6., 7., 8., 9.]]),
#  torch.Size([4, 9]))
```
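For what it's worth, the relationships above can be checked directly. A minimal sketch (assuming stock PyTorch semantics for 1-D inputs): `hstack` on 1-D tensors is plain concatenation along dim 0, `vstack` first promotes each 1-D tensor to a row so it matches `stack(dim=0)`, and `stack(dim=1)` turns each tensor into a column (equivalent to `unsqueeze(1)` followed by `cat(dim=1)`), which is why it differs from `hstack`:

```python
import torch

x = torch.arange(1., 10.)

# For 1-D tensors, hstack concatenates along the existing dim 0 (no new dim)
assert torch.equal(torch.hstack([x, x]), torch.cat([x, x], dim=0))

# vstack promotes 1-D tensors to rows first, so it matches stack(dim=0)
assert torch.equal(torch.vstack([x, x]), torch.stack([x, x], dim=0))

# stack(dim=1) makes each input a column: same as unsqueeze(1) + cat(dim=1)
assert torch.equal(
    torch.stack([x, x], dim=1),
    torch.cat([x.unsqueeze(1), x.unsqueeze(1)], dim=1),
)

# Shapes differ: hstack keeps 1-D, stack(dim=1) adds a dimension
assert torch.hstack([x, x]).shape == torch.Size([18])
assert torch.stack([x, x], dim=1).shape == torch.Size([9, 2])
```

So `hstack` only coincides with `stack(dim=1)` once the inputs are already 2-D column vectors; for 1-D inputs the two do genuinely different things.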
Metadata
Assignees
Labels
No labels