Commit 513b3e3

clarify prompt parameter
1 parent fa2272a commit 513b3e3

File tree: 1 file changed (+1, −1 lines)


docs/how_to_guides/using_llms.md

Lines changed: 1 addition & 1 deletion
@@ -291,7 +291,7 @@ for chunk in stream_chunk_generator
 See LiteLLM’s documentation [here](https://docs.litellm.ai/docs/providers) for details on many other llms.
 
 ## Custom LLM Wrappers
-In case you're using an LLM that isn't natively supported by Guardrails and you don't want to use LiteLLM, you can build a custom LLM API wrapper. In order to use a custom LLM, create a function that takes accepts a prompt as a string and any other arguments that you want to pass to the LLM API as keyword args. The function should return the output of the LLM API as a string.
+In case you're using an LLM that isn't natively supported by Guardrails and you don't want to use LiteLLM, you can build a custom LLM API wrapper. In order to use a custom LLM, create a function that accepts a positional argument for the prompt as a string and any other arguments that you want to pass to the LLM API as keyword args. The function should return the output of the LLM API as a string.
 
 ```python
 from guardrails import Guard
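
The clarified doc line describes the wrapper's expected shape: a positional prompt string plus arbitrary keyword args, returning a string. A minimal sketch of such a wrapper follows; `call_my_llm_api` is a hypothetical stand-in for a real provider client, and the Guard usage is shown only in comments since the exact call pattern depends on your Guardrails version.

```python
def call_my_llm_api(prompt: str, **kwargs) -> str:
    # Hypothetical backend; replace with your provider's actual API call.
    return f"echo: {prompt}"


def my_llm_wrapper(prompt: str, **kwargs) -> str:
    # Positional prompt argument, extra settings forwarded as keyword args.
    temperature = kwargs.pop("temperature", 0.0)
    response = call_my_llm_api(prompt, temperature=temperature, **kwargs)
    # The wrapper must return the LLM output as a plain string.
    return response


# Sketch of how the wrapper might then be handed to a Guard call:
# from guardrails import Guard
# guard = Guard()
# result = guard(my_llm_wrapper, prompt="Hello", temperature=0.2)
```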
