2 parents b8ce042 + 383ce4d commit c8cdf9d
README.md
@@ -6,6 +6,8 @@
 - 🛤️ 2025 Q1 Road Map Released! Join the discussion [here](https://github.com/vllm-project/production-stack/issues/26)!
 - 🔥 vLLM Production Stack is released! Check out our [release blogs](https://blog.lmcache.ai/2025-01-21-stack-release) [01-22-2025]
 
+## Introduction
+
 The **vLLM Production Stack** project provides a reference implementation of how to build an inference stack on top of vLLM, which allows you to:
 
 - 🚀 Scale from a single vLLM instance to a distributed vLLM deployment without changing any application code
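A minimal sketch of what "without changing any application code" means in practice: the stack fronts vLLM with an OpenAI-compatible endpoint, so an existing client only needs to point at that endpoint. The base URL, API key, and model name below are placeholder assumptions for illustration, not values taken from this commit.

```python
# Hypothetical client-side view of the stack: this application code stays the
# same whether the endpoint fronts one vLLM instance or a distributed deployment.
# The base_url, api_key, and model name are placeholders, not part of this commit.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:30080/v1",  # assumed locally exposed stack endpoint
    api_key="EMPTY",                       # vLLM-style endpoints often accept a dummy key
)

response = client.chat.completions.create(
    model="example-model",  # placeholder name of a model served by the stack
    messages=[{"role": "user", "content": "Hello from the production stack!"}],
)
print(response.choices[0].message.content)
```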