
Commit 65b55d1

update design doc (vllm-project#711)

Signed-off-by: hsliu <liuhongsheng4@huawei.com>
1 parent f262191

File tree: 3 files changed, +3 −3 lines changed


README.md

Lines changed: 1 addition & 1 deletion
@@ -16,7 +16,7 @@ Easy, fast, and cheap omni-modality model serving for everyone

*Latest News* 🔥

-- [2026/01] We released [0.12.0rc1](https://github.com/vllm-project/vllm-omni/releases/tag/v0.12.0rc1) - a major RC milestone focused on maturing the diffusion stack, strengthening OpenAI-compatible serving, expanding omni-model coverage, and improving stability across platforms (GPU/NPU/ROCm).
+- [2026/01] We released [0.12.0rc1](https://github.com/vllm-project/vllm-omni/releases/tag/v0.12.0rc1) - a major RC milestone focused on maturing the diffusion stack, strengthening OpenAI-compatible serving, expanding omni-model coverage, and improving stability across platforms (GPU/NPU/ROCm), please check our latest [design](https://docs.google.com/presentation/d/1qv4qMW1rKAqDREMXiUDLIgqqHQe7TDPj/edit?usp=sharing&ouid=110473603432222024453&rtpof=true&sd=true).

- [2025/11] vLLM community officially released [vllm-project/vllm-omni](https://github.com/vllm-project/vllm-omni) in order to support omni-modality models serving.

---

docs/README.md

Lines changed: 1 addition & 1 deletion
@@ -60,5 +60,5 @@ vLLM-Omni seamlessly supports most popular open-source models on HuggingFace, in

For more information, checkout the following:

-- [vllm-omni architecture design and recent roadmaps](https://docs.google.com/presentation/d/1Y7t2Zm3BIISPN-_X_sDpxZMNCWir2rAx/edit?usp=drive_link&ouid=110473603432222024453&rtpof=true&sd=true)
+- [vllm-omni architecture design and recent roadmaps](https://docs.google.com/presentation/d/1qv4qMW1rKAqDREMXiUDLIgqqHQe7TDPj/edit?usp=sharing&ouid=110473603432222024453&rtpof=true&sd=true)

- [vllm-omni announcement blogpost](https://blog.vllm.ai/2025/11/30/vllm-omni.html)

docs/contributing/tests/tests_style.md

Lines changed: 1 addition & 1 deletion
@@ -113,7 +113,7 @@ vllm_omni/ tests/

### Naming Conventions

-- **Unit/System Tests**: Use `test_<module_name>.py` format. Example: `omni_llm.py` → `test_omni_llm.py`
+- **Unit Tests**: Use `test_<module_name>.py` format. Example: `omni_llm.py` → `test_omni_llm.py`

- **E2E Tests**: Place in `tests/e2e/offline_inference/` or `tests/e2e/online_serving/` with descriptive names. Example: `tests/e2e/offline_inference/test_qwen3_omni.py`, `tests/e2e/offline_inference/test_diffusion_model.py`
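The naming convention above can be illustrated with a minimal sketch. Note that `OmniLLM` and its `generate` method here are hypothetical stand-ins for the module under test, not the real vllm-omni API:

```python
# Hypothetical sketch of tests/test_omni_llm.py following the
# test_<module_name>.py convention. OmniLLM is an illustrative
# stand-in, not the actual class from vllm_omni/omni_llm.py.
class OmniLLM:
    """Stand-in for the module under test (omni_llm.py)."""

    def generate(self, prompt: str) -> str:
        # Trivial placeholder behavior so the test is runnable.
        return f"echo: {prompt}"


def test_generate_returns_text():
    # Unit test named test_<function> inside test_<module_name>.py,
    # discoverable by pytest's default collection rules.
    llm = OmniLLM()
    out = llm.generate("hello")
    assert isinstance(out, str)
    assert "hello" in out
```

With this layout, `pytest tests/test_omni_llm.py` picks the test up automatically, since pytest collects `test_*.py` files and `test_*` functions by default.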

0 commit comments
