Description
Hello, I have some questions about best practices for loading data from HBM into SBUF. I've been trying to implement a CNN using NKI, which requires shifting and flattening a higher-dimensional tensor into a 2D one so that it can then be used in matmul operations.
For example, I have been trying to shift and then flatten the input in HBM before loading it into SBUF, but I have been running into issues:
```python
x_shift_flat = x_shift.reshape((in_channels, out_pool_height * out_pool_width))
```

fails with:

```
NotImplementedError: reshape not implemented for base tensor
```
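For context, here is a minimal sketch of the pattern I'm using. The kernel name and the shift offsets of 1 are just placeholders; the shapes match the output I print below:

```python
from neuronxcc import nki
import neuronxcc.nki.language as nl

@nki.jit
def conv_fwd(x):
    # x lives in HBM with shape (4, 128, 32, 16): (batch, channels, height, width)
    in_channels = 128
    out_pool_height, out_pool_width = 30, 14

    # take a shifted window of one batch element; this only produces a view of x,
    # not a standalone HBM tensor (the shift offsets of 1 are placeholders)
    x_shift = x[0, :, 1:1 + out_pool_height, 1:1 + out_pool_width]

    # this is the line that raises NotImplementedError
    x_shift_flat = x_shift.reshape((in_channels, out_pool_height * out_pool_width))

    # intended next step: load the flattened 128 x 420 tensor into SBUF
    x_sbuf = nl.load(x_shift_flat)
    ...
```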
I think this is because x_shift is only a view into x (the original tensor) rather than its own copy. For example, when I run:
print(f"x shift base shape: {x_shift.base.shape}, x shift shape {x_shift.shape}")
I get this output:
x shift base shape: (4, 128, 32, 16), x shift shape (128, 30, 14)
Since reshapes can only be done in HBM, I'm assuming I'll need to make a deep copy of the tensor in HBM, but I haven't found a way to do that either. Is there one? My end goal is to have x_shift_flat in SBUF with shape 128 x 420.
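The closest I've gotten is to skip the HBM reshape entirely and index the shifted window directly inside nl.load. I'm not certain this is the intended approach, but something like the following (with hypothetical batch/shift placeholders) does seem to describe the access pattern I want:

```python
import neuronxcc.nki.language as nl

# inside the kernel body, where x is the (4, 128, 32, 16) HBM input from above;
# b, sh, sw are hypothetical batch-index / shift-offset placeholders
b, sh, sw = 0, 1, 1

i_c = nl.arange(128)[:, None, None]   # channels -> partition dimension
i_h = nl.arange(30)[None, :, None]    # shifted output rows
i_w = nl.arange(14)[None, None, :]    # shifted output cols

# load the shifted window straight from HBM into SBUF as a (128, 30, 14) tile
x_shift_tile = nl.load(x[b, i_c, sh + i_h, sw + i_w])
```

That leaves me with a (128, 30, 14) tile rather than (128, 420), though, so I'm not sure whether the right fix is to express the load differently so the tile arrives flattened, or to flatten the free dimensions once the tile is already in SBUF.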
I have been looking through the NKI docs but haven't been able to find what I'm looking for, so I was wondering if you could provide some advice or point me to a relevant tutorial. Thank you.