This repository was archived by the owner on Sep 4, 2025. It is now read-only.

Commit e83db9e: [Doc] Update docker references (vllm-project#5614)

Signed-off-by: Rafael Vasquez <[email protected]>
Parent commit: 7868750

File tree: 2 files changed (+13 / -14 lines)


docs/source/dev/dockerfile/dockerfile.rst

Lines changed: 10 additions & 10 deletions
@@ -2,19 +2,19 @@ Dockerfile
 ====================
 
 See `here <https://github.com/vllm-project/vllm/blob/main/Dockerfile>`_ for the main Dockerfile to construct
-the image for running an OpenAI compatible server with vLLM.
+the image for running an OpenAI compatible server with vLLM. More information about deploying with Docker can be found `here <https://docs.vllm.ai/en/stable/serving/deploying_with_docker.html>`_.
 
-- Below is a visual representation of the multi-stage Dockerfile. The build graph contains the following nodes:
+Below is a visual representation of the multi-stage Dockerfile. The build graph contains the following nodes:
 
-  - All build stages
-  - The default build target (highlighted in grey)
-  - External images (with dashed borders)
+- All build stages
+- The default build target (highlighted in grey)
+- External images (with dashed borders)
 
-  The edges of the build graph represent:
-
-  - FROM ... dependencies (with a solid line and a full arrow head)
-  - COPY --from=... dependencies (with a dashed line and an empty arrow head)
-  - RUN --mount=(.*)from=... dependencies (with a dotted line and an empty diamond arrow head)
+The edges of the build graph represent:
+
+- FROM ... dependencies (with a solid line and a full arrow head)
+- COPY --from=... dependencies (with a dashed line and an empty arrow head)
+- RUN --mount=(.*)from=... dependencies (with a dotted line and an empty diamond arrow head)
 
 .. figure:: ../../assets/dev/dockerfile-stages-dependency.png
    :alt: query
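The default build target highlighted in that graph can also be built on its own. A minimal sketch, assuming a local checkout of the repository and that `vllm-openai` is the name of the default target stage in the Dockerfile:

```shell
# Build just the default target stage of the multi-stage Dockerfile.
# The stage name `vllm-openai` is an assumption here; check the
# Dockerfile's `FROM ... AS <name>` lines for the exact target names.
DOCKER_BUILDKIT=1 docker build . \
    --target vllm-openai \
    --tag vllm/vllm-openai:dev
```

BuildKit is needed for the `RUN --mount=...` instructions that the graph's dotted edges represent, hence `DOCKER_BUILDKIT=1`.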

docs/source/serving/deploying_with_docker.rst

Lines changed: 3 additions & 4 deletions
@@ -3,9 +3,8 @@
 Deploying with Docker
 ============================
 
-vLLM offers official docker image for deployment.
-The image can be used to run OpenAI compatible server.
-The image is available on Docker Hub as `vllm/vllm-openai <https://hub.docker.com/r/vllm/vllm-openai/tags>`_.
+vLLM offers an official Docker image for deployment.
+The image can be used to run OpenAI compatible server and is available on Docker Hub as `vllm/vllm-openai <https://hub.docker.com/r/vllm/vllm-openai/tags>`_.
 
 
 .. code-block:: console

@@ -25,7 +24,7 @@ The image is available on Docker Hub as `vllm/vllm-openai <https://hub.docker.co
 memory to share data between processes under the hood, particularly for tensor parallel inference.
 
 
-You can build and run vLLM from source via the provided dockerfile. To build vLLM:
+You can build and run vLLM from source via the provided `Dockerfile <https://github.com/vllm-project/vllm/blob/main/Dockerfile>`_. To build vLLM:
 
 .. code-block:: console
 
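Tying the two changed passages together, the published Docker Hub image can be started as an OpenAI compatible server. A hedged sketch (the model name is only an example, and an NVIDIA GPU with the nvidia container runtime is assumed); `--ipc=host` addresses the shared-memory requirement for tensor parallel inference noted in the second hunk:

```shell
# Start the OpenAI-compatible server from the Docker Hub image.
# --ipc=host gives the container a large shared-memory segment, which
# vLLM uses for inter-process communication in tensor parallel runs
# (an explicit --shm-size works as an alternative).
docker run --runtime nvidia --gpus all \
    -v ~/.cache/huggingface:/root/.cache/huggingface \
    -p 8000:8000 \
    --ipc=host \
    vllm/vllm-openai:latest \
    --model mistralai/Mistral-7B-v0.1
```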
