
Commit 9f909bd

Ki-Seki authored and RobinPicard committed
Add SamplingParams to model response example
Updated example to include SamplingParams in response call.
1 parent 0f9f05f · commit 9f909bd

File tree: 1 file changed (+2 −2 lines changed)


docs/features/models/vllm_offline.md

Lines changed: 2 additions & 2 deletions
@@ -44,15 +44,15 @@ For instance:
```python
import outlines
-from vllm import LLM
+from vllm import LLM, SamplingParams

# Create the model
model = outlines.from_vllm_offline(
    LLM("microsoft/Phi-3-mini-4k-instruct")
)

# Call it to generate text
-response = model("What's the capital of Latvia?", max_tokens=20)
+response = model("What's the capital of Latvia?", sampling_params=SamplingParams(max_tokens=20))
print(response) # 'Riga'
```
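For reference, a minimal sketch of how the updated example reads after this change. The model construction and the `sampling_params` call come straight from the diff above; the extra `temperature` field is an assumption added only to illustrate that other vLLM `SamplingParams` options can be passed the same way.

```python
import outlines
from vllm import LLM, SamplingParams

# Wrap a vLLM offline engine in an Outlines model
model = outlines.from_vllm_offline(
    LLM("microsoft/Phi-3-mini-4k-instruct")
)

# Generation settings go through vLLM's SamplingParams;
# max_tokens=20 is from the diff, temperature=0.0 is illustrative (assumption)
params = SamplingParams(max_tokens=20, temperature=0.0)
response = model("What's the capital of Latvia?", sampling_params=params)
print(response)  # 'Riga'
```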