Status: Closed
Labels: bug (Something isn't working)
Description
Describe the bug
I am getting `ValueError: Failed to get valid response from Llamaguard-7B model. Status: None. Detail: Unknown error on Llama Guard.`
The following is the configuration I am using:
- Remote inferencing is enabled
- `config.py` for the guard:
```python
from guardrails import Guard, OnFailAction
from guardrails.hub import LlamaGuard7B

llama_guard = Guard(
    name="llama_guard"
).use(
    LlamaGuard7B,
    policies=[
        LlamaGuard7B.POLICY__NO_ILLEGAL_DRUGS,
        LlamaGuard7B.POLICY__NO_VIOLENCE_HATE,
        LlamaGuard7B.POLICY__NO_SEXUAL_CONTENT,
        LlamaGuard7B.POLICY__NO_CRIMINAL_PLANNING,
        LlamaGuard7B.POLICY__NO_GUNS_AND_ILLEGAL_WEAPONS,
        LlamaGuard7B.POLICY__NO_ENOURAGE_SELF_HARM,
    ],
    on_fail=OnFailAction.EXCEPTION,
)
```
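To make the failure mode concrete, here is a minimal, self-contained sketch of how the error surfaces when the guard is invoked. The `run_guard` function below is a hypothetical stub standing in for the real `llama_guard.validate(...)` call (the remote Llamaguard-7B endpoint is not available here), simulating the exact `ValueError` from this report:

```python
def run_guard(text: str) -> str:
    # Hypothetical stub: in the real setup this calls the remote
    # Llamaguard-7B endpoint; here we simulate the reported failure.
    raise ValueError(
        "Failed to get valid response from Llamaguard-7B model. "
        "Status: None. Detail: Unknown error on Llama Guard."
    )

def check(text: str) -> str:
    try:
        return run_guard(text)
    except ValueError as exc:
        # "Status: None" suggests no HTTP response object was ever
        # received, pointing at the remote-inferencing endpoint.
        return f"guard failed: {exc}"
```

Catching the `ValueError` this way at least lets the application degrade gracefully while the remote-inferencing issue is diagnosed.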
To Reproduce
Steps to reproduce the behavior:
- Run with the provided config and validate any LLM output
Expected behavior
A valid validation response for the `llmOutput` message
Library version: 0.6.6