content/blog/using-structured-outputs-in-vllm.md (+8 −8)
@@ -13,7 +13,7 @@ tags:
---
Generating predictable and reliable outputs from large language models (LLMs) can be challenging, especially when those outputs need to integrate seamlessly with downstream systems. Structured outputs solve this problem by enforcing specific formats, such as JSON, regex patterns, or even grammars. vLLM has supported this for some time, but there was no documentation on how to use it, which is why I decided to contribute and write the [Structured Outputs documentation page](https://docs.vllm.ai/en/latest/usage/structured_outputs.html).
-###Why Structured Outputs?
+## Why Structured Outputs?
LLMs are incredibly powerful, but their outputs can be inconsistent when a specific format is required. Structured outputs address this issue by restricting the model’s generated text to adhere to predefined rules or formats, ensuring:
@@ -28,7 +28,7 @@ How these tools work? The idea is that we´ll be able to filter the list of poss
-###What is vLLM?
+## What is vLLM?
vLLM is a state-of-the-art, open-source inference and serving engine for LLMs. It’s built for performance and simplicity, offering:
@@ -38,7 +38,7 @@ vLLM is a state-of-the-art, open-source inference and serving engine for LLMs. I
These optimizations make vLLM one of the fastest and most versatile engines for production environments.
-###Structured outputs on vLLM
+## Structured outputs on vLLM
vLLM extends the OpenAI API with additional parameters to enable structured outputs. These include:
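The parameter list itself is elided by this hunk, but to illustrate how such extra parameters are used in practice, here is a minimal sketch that points the official `openai` Python client at a locally running vLLM OpenAI-compatible server. The port, placeholder API key, and model name are assumptions for illustration, not values taken from the post:

```python
# Minimal sketch: query a local vLLM OpenAI-compatible server with the
# official openai client. base_url, api_key, and model are assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed default local vLLM endpoint
    api_key="EMPTY",                      # placeholder; a local server usually needs no real key
)

completion = client.chat.completions.create(
    model="Qwen/Qwen2.5-3B-Instruct",     # whichever model the server was launched with
    messages=[
        {"role": "user", "content": "Classify this sentiment: vLLM is wonderful!"}
    ],
)
print(completion.choices[0].message.content)
```

The structured-output parameters described below are passed on top of this plain request, so no other changes to the client are needed.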
@@ -49,7 +49,7 @@ vLLM extends the OpenAI API with additional parameters to enable structured outp
Here’s how each works, along with example outputs:
-####**1. Guided Choice**
+### **1. Guided Choice**
The simplest form of structured output, ensuring the response is one of a set of predefined options.
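As a concrete sketch of guided choice, the option set can be passed through the client's `extra_body` as vLLM's `guided_choice` extra parameter; this reuses the `client` from the earlier sketch, and the model name and labels are illustrative:

```python
# Guided choice: constrain generation so the reply is exactly one of the
# listed options. Reuses `client` from the sketch above.
completion = client.chat.completions.create(
    model="Qwen/Qwen2.5-3B-Instruct",
    messages=[
        {"role": "user", "content": "Classify this sentiment: vLLM is wonderful!"}
    ],
    extra_body={"guided_choice": ["positive", "negative"]},
)
print(completion.choices[0].message.content)  # e.g. "positive"
```

Because decoding is constrained token by token, the reply is guaranteed to be one of the listed strings, which makes it safe to feed directly into downstream logic.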