This repository was archived by the owner on Sep 4, 2025. It is now read-only.
docs/source/dev/dockerfile/dockerfile.rst (10 additions, 10 deletions)
@@ -2,19 +2,19 @@ Dockerfile
 ====================
 
 See `here <https://github.com/vllm-project/vllm/blob/main/Dockerfile>`_ for the main Dockerfile to construct
-the image for running an OpenAI compatible server with vLLM.
+the image for running an OpenAI compatible server with vLLM. More information about deploying with Docker can be found `here <https://docs.vllm.ai/en/stable/serving/deploying_with_docker.html>`_.
 
-- Below is a visual representation of the multi-stage Dockerfile. The build graph contains the following nodes:
+Below is a visual representation of the multi-stage Dockerfile. The build graph contains the following nodes:
 
-- All build stages
-- The default build target (highlighted in grey)
-- External images (with dashed borders)
+- All build stages
+- The default build target (highlighted in grey)
+- External images (with dashed borders)
 
-The edges of the build graph represent:
-
-- FROM ... dependencies (with a solid line and a full arrow head)
-- COPY --from=... dependencies (with a dashed line and an empty arrow head)
-- RUN --mount=(.*)from=... dependencies (with a dotted line and an empty diamond arrow head)
+The edges of the build graph represent:
+
+- FROM ... dependencies (with a solid line and a full arrow head)
+- COPY --from=... dependencies (with a dashed line and an empty arrow head)
+- RUN --mount=(.*)from=... dependencies (with a dotted line and an empty diamond arrow head)
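
The three edge types listed in the doc map directly onto Dockerfile instructions. As a minimal illustration (the stage names `base`, `build`, and `final` and the file paths are hypothetical, not vLLM's actual build stages), a multi-stage Dockerfile producing all three kinds of edges might look like:

```dockerfile
# External image (dashed border in the graph); FROM edge to ubuntu:22.04
FROM ubuntu:22.04 AS base

# FROM edge: solid line from build -> base
FROM base AS build
RUN echo "artifact" > /out.txt

FROM base AS final
# COPY --from=... edge: dashed line from final -> build
COPY --from=build /out.txt /app/out.txt
# RUN --mount=...from=... edge: dotted line from final -> build
RUN --mount=type=bind,from=build,source=/out.txt,target=/tmp/out.txt \
    cat /tmp/out.txt
```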
docs/source/serving/deploying_with_docker.rst (3 additions, 4 deletions)
@@ -3,9 +3,8 @@
 Deploying with Docker
 ============================
 
-vLLM offers official docker image for deployment.
-The image can be used to run OpenAI compatible server.
-The image is available on Docker Hub as `vllm/vllm-openai <https://hub.docker.com/r/vllm/vllm-openai/tags>`_.
+vLLM offers an official Docker image for deployment.
+The image can be used to run OpenAI compatible server and is available on Docker Hub as `vllm/vllm-openai <https://hub.docker.com/r/vllm/vllm-openai/tags>`_.
 
 .. code-block:: console
 
@@ -25,7 +24,7 @@ The image is available on Docker Hub as `vllm/vllm-openai <https://hub.docker.co
 memory to share data between processes under the hood, particularly for tensor parallel inference.
 
 
-You can build and run vLLM from source via the provided dockerfile. To build vLLM:
+You can build and run vLLM from source via the provided `Dockerfile <https://github.com/vllm-project/vllm/blob/main/Dockerfile>`_. To build vLLM: