docs/faq.md

# Frequently Asked Questions
## I get an "Unauthorized" error when installing validators from the Guardrails Hub. What should I do?
If you see an "Unauthorized" error when installing validators from the Guardrails Hub, it means that the API key you are using is not authorized to access the Hub. The key may be unset or expired.
To fix this, first generate a new API key from the [Guardrails Hub](https://hub.guardrailsai.com/keys). Then, configure the Guardrails CLI with the new API key.
```bash
guardrails configure
```
There is also a headless option that configures the CLI with the token directly:
```bash
guardrails configure --token <your_token>
```
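
Once the CLI is configured with a valid key, re-running the failed install should succeed. For example, using the `profanity_free` validator (chosen here only as an illustration; substitute the validator you were installing):

```bash
# Retry the install after configuring the new API key
guardrails hub install hub://guardrails/profanity_free
```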
## I'm seeing a PromptCallableException when invoking my Guard. What should I do?
docs/how_to_guides/using_llms.md

## Other LLMs

See LiteLLM's documentation [here](https://docs.litellm.ai/docs/providers) for details on many other LLMs.
## Custom LLM Wrappers
In case you're using an LLM that isn't natively supported by Guardrails and you don't want to use LiteLLM, you can build a custom LLM API wrapper. To use a custom LLM, create a function that accepts the prompt as a positional string argument, plus any other arguments you want to pass to the LLM API as keyword arguments. The function should return the output of the LLM API as a string.
```python
from typing import Optional

from guardrails import Guard
from guardrails.hub import ProfanityFree

# Create a Guard class
guard = Guard().use(ProfanityFree())

# Function that takes the prompt as a string and returns the LLM output as a string
def my_llm_api(
    prompt: Optional[str] = None,
    *,
    instructions: Optional[str] = None,
    msg_history: Optional[list[dict]] = None,
    **kwargs,
) -> str:
    """Custom LLM API wrapper.

    At least one of prompt, instructions, or msg_history should be provided.

    Args:
        prompt (str): The prompt to be passed to the LLM API
        instructions (str): The instructions to be passed to the LLM API
        msg_history (list[dict]): The message history to be passed to the LLM API
        **kwargs: Any additional arguments to be passed to the LLM API

    Returns:
        str: The output of the LLM API
    """
    # Call your LLM API here.
    # What you pass to the LLM will depend on what arguments it accepts.
    llm_output = some_llm(prompt, instructions, msg_history, **kwargs)  # placeholder: your LLM call
    return llm_output
```
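
The wrapper's call shape can be checked in isolation by substituting a stub for the real model. The `echo_llm_api` function below is purely illustrative (it is not part of Guardrails); it simply demonstrates a function matching the required signature: a positional prompt, keyword-only extras, and a string return value.

```python
from typing import Optional


def echo_llm_api(
    prompt: Optional[str] = None,
    *,
    instructions: Optional[str] = None,
    msg_history: Optional[list[dict]] = None,
    **kwargs,
) -> str:
    """Stub with the same signature a custom LLM wrapper needs.

    Instead of calling a model, it echoes back whichever input it was
    given, so the calling convention can be exercised without an API key.
    """
    if prompt is not None:
        return f"echo: {prompt}"
    if msg_history:
        # Echo the content of the most recent message
        return f"echo: {msg_history[-1]['content']}"
    raise ValueError("Provide at least one of prompt or msg_history")


print(echo_llm_api("hello"))  # prompt passed positionally
print(echo_llm_api(msg_history=[{"role": "user", "content": "hi"}]))
```

With a real model call substituted for the echo logic, a function like this could then be handed to your Guard in place of a natively supported LLM.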