Commit 07d1028
Fix: Handle nn.Module in move_to_device
The move_to_device function previously expected only torch.Tensor objects.
This caused an AttributeError when a torch.nn.Module (such as a Linear layer)
was passed in, because modules do not expose a `.device` attribute the way
tensors do.
This commit modifies move_to_device to:
1. Check whether the input is an instance of nn.Module.
2. If so, use the module's own `.to(device)` method for device placement.
3. Update the function's type hints to reflect that it accepts
Union[torch.Tensor, torch.nn.Module] (see the sketch after this list).
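Since the changed file's contents were not captured in the diff below, here is a minimal sketch of what the updated function might look like, assuming it takes the object and a target device (the exact signature and module path are assumptions):

```python
from typing import Union

import torch
import torch.nn as nn


def move_to_device(
    obj: Union[torch.Tensor, nn.Module],
    device: torch.device,
) -> Union[torch.Tensor, nn.Module]:
    """Move a tensor or module to the target device."""
    if isinstance(obj, nn.Module):
        # nn.Module.to() moves the module's parameters and buffers in place
        # and returns the module itself.
        return obj.to(device)
    if isinstance(obj, torch.Tensor):
        # Tensor.to() returns a tensor on the target device.
        return obj.to(device)
    raise TypeError(f"Expected torch.Tensor or nn.Module, got {type(obj)!r}")
```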
Additionally, a new unit test has been added to test_api.py to verify
that move_to_device correctly handles both torch.Tensor and torch.nn.Module
objects, moving them to the target device (CPU or CUDA) as expected.
This ensures the fix is effective and guards against regressions.
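A sketch of what the added tests might look like, assuming pytest is used; the import path of move_to_device is hypothetical, since the captured diff does not show it:

```python
import pytest
import torch
import torch.nn as nn

# Hypothetical import path; the real module containing move_to_device
# is not shown in the captured diff.
from api import move_to_device


def test_move_to_device_tensor_cpu():
    t = torch.ones(3)
    moved = move_to_device(t, torch.device("cpu"))
    assert moved.device.type == "cpu"


def test_move_to_device_module_cpu():
    layer = nn.Linear(4, 2)
    moved = move_to_device(layer, torch.device("cpu"))
    assert all(p.device.type == "cpu" for p in moved.parameters())


@pytest.mark.skipif(not torch.cuda.is_available(), reason="CUDA not available")
def test_move_to_device_module_cuda():
    layer = nn.Linear(4, 2)
    moved = move_to_device(layer, torch.device("cuda"))
    assert all(p.device.type == "cuda" for p in moved.parameters())
```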
1 parent 54e44e6 · 2 files changed: +67 −5 lines changed