@@ -11,23 +11,73 @@ First, clone the repository:
git clone https://github.com/NVIDIA/Torch-TensorRT
```

- Next, build the NVIDIA Torch-TensorRT container (from repo root) :
+ Next, navigate to the repo's root directory:

```
cd Torch-TensorRT
+ ```
+
+ ### a. Using the NGC PyTorch container
+
+ At this point, we recommend pulling the [PyTorch container](https://catalog.ngc.nvidia.com/orgs/nvidia/containers/pytorch)
+ from [NVIDIA GPU Cloud](https://catalog.ngc.nvidia.com/) as follows:
+
+ ```
+ docker pull nvcr.io/nvidia/pytorch:21.12-py3
+ ```
+
+ To pull a different version of the container, replace ```21.12``` with a tag of the form
+ ```yy.mm```, where ```yy``` is the last two digits of the calendar year and ```mm``` is
+ the month in two-digit numerical form.
+
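+ For example, to pull the container released in May 2022 (assuming a ```22.05``` tag is available on NGC), you would run
+
+ ```
+ # 22.05 is used only for illustration; check NGC for the tags that are actually available
+ docker pull nvcr.io/nvidia/pytorch:22.05-py3
+ ```
+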
+ The NGC PyTorch container ships with the Torch-TensorRT tutorial notebooks.
+ Therefore, you can run the container and the notebooks therein without
+ mounting the repo to the container. To do so, run
+
+ ```
+ docker run --gpus=all --rm -it --net=host --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 nvcr.io/nvidia/pytorch:21.12-py3 bash
+ ```
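+
+ Once inside the container, you can optionally verify that Torch-TensorRT itself is available (recent NGC PyTorch containers ship it preinstalled) with
+
+ ```
+ # prints the bundled Torch-TensorRT version
+ python -c "import torch_tensorrt; print(torch_tensorrt.__version__)"
+ ```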
+
+ If, however, you wish for your work in the notebooks to persist, use the
+ ```-v``` flag to mount the repo to the container as follows:
+
+ ```
+ docker run --gpus=all --rm -it -v $PWD:/Torch-TensorRT --net=host --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 nvcr.io/nvidia/pytorch:21.12-py3 bash
+ ```
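+
+ If you mounted the repo, you can quickly confirm inside the container that the mount worked by listing the notebooks directory:
+
+ ```
+ # the repo root was mounted at /Torch-TensorRT by the -v flag above
+ ls /Torch-TensorRT/notebooks
+ ```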
+
+ ### b. Building a Torch-TensorRT container from source
+
+ Alternatively, to build the container from source, run
+
+ ```
docker build -t torch_tensorrt -f ./docker/Dockerfile .
```

- Then launch the container with:
+ To run this container, enter the following command:
+
+ ```
+ docker run --gpus=all --rm -it -v $PWD:/Torch-TensorRT --net=host --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 torch_tensorrt:latest bash
+ ```
+
+ ### c. Running the notebooks inside the container
+
+ Within the docker interactive bash session, proceed to the notebooks.
+ To use the notebooks which ship with the container, run

```
- docker run --gpus=all --rm -it -v $PWD:/Torch-TensorRT --net=host --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 torch_tensorrt bash
+ cd /workspace/examples/torch_tensorrt/notebooks
```

- Within the docker interactive bash session, start Jupyter with
+ If, however, you mounted the repo to the container, run

```
cd /Torch-TensorRT/notebooks
+ ```
+
+ Once you have entered the appropriate ```notebooks``` directory, start Jupyter with
+
+ ```
jupyter notebook --allow-root --ip 0.0.0.0 --port 8888
```
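+
+ Because the container shares the host network (```--net=host```), Jupyter's port must be free on the host. If 8888 is already in use, any other free port works, for example
+
+ ```
+ # same command, bound to an alternative port
+ jupyter notebook --allow-root --ip 0.0.0.0 --port 8889
+ ```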