_posts/2025-11-30-vllm-omni.md
vLLM-Omni is evolving rapidly. Our roadmap is focused on expanding model support…
## **Getting Started**
Getting started with vLLM-Omni is straightforward. The initial vllm-omni v0.11.0rc release is built on top of vLLM v0.11.0.
### **Installation**
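Since the post notes that vllm-omni v0.11.0rc is built on top of vLLM v0.11.0, installation presumably follows the usual pip workflow. The commands below are a hedged sketch: the exact package name, index, and release tag are assumptions, not confirmed by this post, so check the project's README for the authoritative instructions.

```shell
# Hypothetical installation sketch -- package names and version pins are
# assumptions based on the versions mentioned in this post.
pip install vllm==0.11.0      # the base engine this release builds on
pip install --pre vllm-omni   # assumed PyPI name; --pre allows the rc release
```

If the release candidate is not yet on PyPI, installing from the GitHub repository source is the likely fallback.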
Check out our [examples directory](https://github.com/vllm-project/vllm-omni/tre…
This is just the beginning for omni-modality serving. We are actively developing support for more architectures and invite the community to help shape the future of vLLM-Omni.