Commit 5ebaaf9

Update memory.md
1 parent 38e9667 commit 5ebaaf9

File tree

1 file changed (+2, −4 lines)

docfx/articles/memory.md

Lines changed: 2 additions & 4 deletions
````diff
@@ -351,7 +351,7 @@ foreach (var tensor in tensors)
 }
 ```

-Meanwhile, if you want do write a dataset on your own, you shall notice that data loaders will dispose the tensors got from `GetTensor` after collation. So a dataset like this will not work because the saved tensor is disposed:
+Meanwhile, when writing a dataset of your own, note that the data loaders dispose the tensors created in `GetTensor` after collation. So a dataset like this will not work, because the saved tensor will be disposed:

 ```csharp
 using TorchSharp;
````
````diff
@@ -377,7 +377,7 @@ class MyDataset : torch.utils.data.Dataset
 }
 ```

-Since the actual technique to "catch" the tensors is just a simple dispose scope. So we can write the class like this to avoid the disposal:
+Since the actual technique used to "catch" the tensors is just a simple dispose scope, we can write the class like this to avoid the disposal:

 ```csharp
 class MyDataset : torch.utils.data.Dataset
````
````diff
@@ -395,8 +395,6 @@ class MyDataset : torch.utils.data.Dataset
 }
 ```

-```
-
 ## Links and resources

 These articles might give you ideas about techniques to use to analyse memory. The code is in Python but generally will translate across:
````
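For reference, the dispose-scope pattern the committed text describes might be sketched like this. This is an illustration only, not code from the commit: it assumes TorchSharp's `torch.utils.data.Dataset` base class with a `GetTensor(long)` override, and the `Tensor.DetachFromDisposeScope()` method; the tensor shapes and names are made up.

```csharp
// Sketch of a dataset that keeps a preloaded tensor alive across collation.
// Assumes TorchSharp's torch.utils.data.Dataset base class and
// Tensor.DetachFromDisposeScope(); verify against the current TorchSharp API.
using System.Collections.Generic;
using TorchSharp;

class MyDataset : torch.utils.data.Dataset
{
    private readonly torch.Tensor _preloaded;

    public MyDataset()
    {
        // Detach the tensor from any ambient dispose scope, so the
        // data loader's scope cannot free it after collation.
        _preloaded = torch.zeros(3, 4).DetachFromDisposeScope();
    }

    public override long Count => 1;

    public override Dictionary<string, torch.Tensor> GetTensor(long index)
    {
        // Hand the loader its own copy; the loader disposes what it
        // receives after collation, leaving _preloaded intact.
        return new() { ["data"] = _preloaded.clone() };
    }
}
```

The key point, matching the committed text, is that only tensors created (or left attached) inside `GetTensor` are swept up by the loader's dispose scope; anything detached or created elsewhere survives.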
