### What this PR does / why we need it?
Add release note for v0.11.0rc0
### Does this PR introduce _any_ user-facing change?
### How was this patch tested?
- vLLM version: v0.11.0rc3
- vLLM main: vllm-project/vllm@releases/v0.11.0
Signed-off-by: wangxiyuan <[email protected]>
|v0.10.2rc1|Latest release candidate|[QuickStart](https://vllm-ascend.readthedocs.io/en/latest/quick_start.html) and [Installation](https://vllm-ascend.readthedocs.io/en/latest/installation.html) for more details|
+|v0.11.0rc0|Latest release candidate|[QuickStart](https://vllm-ascend.readthedocs.io/en/latest/quick_start.html) and [Installation](https://vllm-ascend.readthedocs.io/en/latest/installation.html) for more details|
|v0.9.1|Latest stable version|[QuickStart](https://vllm-ascend.readthedocs.io/en/v0.9.1-dev/quick_start.html) and [Installation](https://vllm-ascend.readthedocs.io/en/v0.9.1-dev/installation.html) for more details|
|v0.9.1|Latest official/stable version|[QuickStart](https://vllm-ascend.readthedocs.io/en/v0.9.1-dev/quick_start.html) and [Installation Guide](https://vllm-ascend.readthedocs.io/en/v0.9.1-dev/installation.html) for more details|
## Contributing
@@ -73,7 +73,7 @@ vllm-ascend has a main branch and development branches.
| Branch | Status | Note |
|------------|------------|---------------------|
-| main | Maintained | CI maintained based on the vLLM main branch |
+| main | Maintained | CI maintained based on the vLLM main branch and the latest vLLM release (v0.11.0) |
docs/source/user_guide/release_notes.md: 24 additions & 0 deletions
@@ -1,5 +1,29 @@
# Release note

+## v0.11.0rc0 - 2025.09.30
+
+This is a special release candidate of v0.11.0 for vLLM Ascend. Please follow the [official doc](https://vllm-ascend.readthedocs.io/en/) to get started.
+
+### Highlights
+
+- DeepSeek V3.2 is supported now. [#3270](https://github.com/vllm-project/vllm-ascend/pull/3270)
+- Qwen3-VL is supported now. [#3103](https://github.com/vllm-project/vllm-ascend/pull/3103)
+
+### Core
+
+- DeepSeek works with aclgraph now. [#2707](https://github.com/vllm-project/vllm-ascend/pull/2707)
+- MTP works with aclgraph now. [#2932](https://github.com/vllm-project/vllm-ascend/pull/2932)
+- EPLB is supported now. [#2956](https://github.com/vllm-project/vllm-ascend/pull/2956)
+- The Mooncake store KV cache connector is supported now. [#2913](https://github.com/vllm-project/vllm-ascend/pull/2913)
+- The CPU offload connector is supported now. [#1659](https://github.com/vllm-project/vllm-ascend/pull/1659)
+
+### Other
+
+- Qwen3-Next is stable now. [#3007](https://github.com/vllm-project/vllm-ascend/pull/3007)
+- Fixed many Qwen3-Next bugs that were introduced in v0.10.2. [#2964](https://github.com/vllm-project/vllm-ascend/pull/2964) [#2781](https://github.com/vllm-project/vllm-ascend/pull/2781) [#3070](https://github.com/vllm-project/vllm-ascend/pull/3070) [#3113](https://github.com/vllm-project/vllm-ascend/pull/3113)
+- The LoRA feature is back now. [#3044](https://github.com/vllm-project/vllm-ascend/pull/3044)
+- The Eagle3 speculative decoding method is back now. [#2949](https://github.com/vllm-project/vllm-ascend/pull/2949)
+
## v0.10.2rc1 - 2025.09.16

This is the 1st release candidate of v0.10.2 for vLLM Ascend. Please follow the [official doc](https://vllm-ascend.readthedocs.io/en/) to get started.
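For readers who want to try this release candidate quickly, below is a minimal offline-inference sketch using vLLM's standard Python API. The model name and sampling settings are illustrative assumptions only; the QuickStart and Installation guides linked above remain the authoritative setup instructions for vllm-ascend.

```python
# Minimal sketch for trying the v0.11.0rc0 release candidate on Ascend.
# Assumptions: vllm and vllm-ascend are already installed per the Installation
# guide, and "Qwen/Qwen3-8B" is used purely as an example model name.
from vllm import LLM, SamplingParams

prompts = ["Hello, my name is", "The future of AI is"]
sampling_params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=64)

# vllm-ascend plugs in as a platform plugin, so the standard vLLM entry point is used.
llm = LLM(model="Qwen/Qwen3-8B")
outputs = llm.generate(prompts, sampling_params)

for output in outputs:
    print(f"Prompt: {output.prompt!r}")
    print(f"Generated: {output.outputs[0].text!r}")
```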