@@ -17,6 +17,8 @@ Next, navigate to the repo's root directory:
cd Torch-TensorRT
```
+ ### a. Using the NGC PyTorch container
+
At this point, we recommend pulling the [PyTorch container](https://catalog.ngc.nvidia.com/orgs/nvidia/containers/pytorch)
from [NVIDIA GPU Cloud](https://catalog.ngc.nvidia.com/) as follows:
@@ -29,12 +31,6 @@ where ```yy``` indicates the last two numbers of a calendar year, and
```mm``` indicates the month in two-digit numerical form, if you wish
to pull a different version of the container.
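For example, assuming you want the 21.12 release (the same tag used in the ```docker run``` commands below), the pull would be:

```
# 21.12 is the yy.mm tag matching the run commands later in this guide
docker pull nvcr.io/nvidia/pytorch:21.12-py3
```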
- Alternatively, to build the container from source, run
-
- ```
- docker build -t torch_tensorrt -f ./docker/Dockerfile .
- ```
-
The NGC PyTorch container ships with the Torch-TensorRT tutorial notebooks.
Therefore, you can run the container and the notebooks therein without
mounting the repo to the container. To do so, run
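A minimal sketch of that command, assuming the same flags as the persistent variant shown below but without the ```-v``` bind mount:

```
# sketch: identical flags to the mounted run command below, minus the -v bind mount
docker run --gpus=all --rm -it --net=host --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 nvcr.io/nvidia/pytorch:21.12-py3 bash
```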
@@ -50,12 +46,22 @@ If, however, you wish for your work in the notebooks to persist, use the
docker run --gpus=all --rm -it -v $PWD:/Torch-TensorRT --net=host --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 nvcr.io/nvidia/pytorch:21.12-py3 bash
```
- If you're using a container built from source, run this instead:
+ ### b. Building a Torch-TensorRT container from source
+
+ Alternatively, to build the container from source, run
+
+ ```
+ docker build -t torch_tensorrt -f ./docker/Dockerfile .
+ ```
+
+ To run this container, enter the following command:
```
docker run --gpus=all --rm -it -v $PWD:/Torch-TensorRT --net=host --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 torch_tensorrt:latest bash
```
+ ### c. Running the notebooks inside the container
+
Within the docker interactive bash session, proceed to the notebooks.
To use the notebooks which ship with the container, run
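A minimal sketch of such a launch, assuming a standard Jupyter invocation from the notebooks directory (the directory path below is an assumption):

```
# assumed path; change to wherever the notebooks live in your container
cd /Torch-TensorRT/notebooks
jupyter notebook --allow-root --ip 0.0.0.0 --port 8888
```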