Commit 395bdec

Change to Introduction
1 parent 3a27332 commit 395bdec

File tree

1 file changed: +1 -1 lines changed

_posts/2025-10-09-blackwell-inferencemax.md

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@ title: "SemiAnalysis InferenceMAX: vLLM and NVIDIA Accelerate Blackwell Inferenc
 author: "vLLM Team"
 ---
 
-### SemiAnalysis InferenceMAX: vLLM and NVIDIA Accelerate Blackwell Inference
+### Introduction
 
 Over the past several months, we’ve been collaborating closely with NVIDIA to unlock the full potential of their latest NVIDIA Blackwell GPU architecture (B200/GB200) for large language model inference using vLLM. Blackwell GPUs introduce a new class of performance and efficiency improvements, such as increased memory bandwidth and native FP4 tensor cores, opening exciting opportunities to accelerate inference workloads.
