@@ -11,7 +11,13 @@ git clone https://github.com/Lightning-AI/lightning.git
docker image build -t pytorch-lightning:latest -f dockers/base-cuda/Dockerfile .
# build with specific arguments
- docker image build -t pytorch-lightning:base-cuda-py3.9-torch1.13-cuda11.7.1 -f dockers/base-cuda/Dockerfile --build-arg PYTHON_VERSION=3.9 --build-arg PYTORCH_VERSION=1.13 --build-arg CUDA_VERSION=11.7.1 .
+ docker image build \
+ -t pytorch-lightning:base-cuda12.6.3-py3.10-torch2.8 \
+ -f dockers/base-cuda/Dockerfile \
+ --build-arg PYTHON_VERSION=3.10 \
+ --build-arg PYTORCH_VERSION=2.8 \
+ --build-arg CUDA_VERSION=12.6.3 \
+ .
```
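
Once the build finishes, a quick way to confirm the image picked up the requested versions is to run a one-liner inside it (a rough sketch using the tag from the command above; it assumes `python` is on the image's PATH):

```bash
# print the PyTorch version and the CUDA version it was built against
docker run --rm pytorch-lightning:base-cuda12.6.3-py3.10-torch2.8 \
    python -c "import torch; print(torch.__version__, torch.version.cuda)"
```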
To run your docker image, use
@@ -45,18 +51,18 @@ sudo systemctl restart docker
and later run the docker image with `--gpus all`. For example,
```
- docker run --rm -it --gpus all pytorchlightning/pytorch_lightning:base-cuda-py3.9-torch1.13-cuda11.7.1
+ docker run --rm -it --gpus all pytorchlightning/pytorch_lightning:base-cuda12.6.3-py3.10-torch2.8
```
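
To double-check that the GPUs are actually visible inside the container (a minimal sanity check, assuming the NVIDIA Container Toolkit is configured as described above), `nvidia-smi` can be run through the same image:

```bash
# the NVIDIA runtime injects the driver utilities, so this should print
# the same GPU table as on the host
docker run --rm --gpus all pytorchlightning/pytorch_lightning:base-cuda12.6.3-py3.10-torch2.8 nvidia-smi
```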
## Run Jupyter server
1. Build the docker image:
```bash
- docker image build -t pytorch-lightning:v1.6.5 -f dockers/nvidia/Dockerfile --build-arg LIGHTNING_VERSION=1.6.5 .
+ docker image build -t pytorch-lightning:v2.5.1 -f dockers/nvidia/Dockerfile --build-arg LIGHTNING_VERSION=2.5.1 .
```
1. Start the server and map ports:
```bash
- docker run --rm -it --gpus=all -p 8888:8888 pytorch-lightning:v1.6.5
+ docker run --rm -it --gpus=all -p 8888:8888 pytorch-lightning:v2.5.1
```
1. Connect in your local browser:
- Copy the generated path, e.g. `http://hostname:8888/?token=0719fa7e1729778b0cec363541a608d5003e26d4910983c6`
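
If the container runs on a remote machine, one common way to reach the notebook (the user and hostname below are placeholders) is to forward the published port over SSH and then open the copied URL with `localhost` substituted for the remote hostname:

```bash
# forward local port 8888 to port 8888 on the machine running the container
ssh -N -L 8888:localhost:8888 user@hostname
```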