[bug] 500 status code for Llamaguard-7B #1276

@iamvineettiwari

Description

Describe the bug
Getting ValueError: Failed to get valid response from Llamaguard-7B model. Status: None. Detail: Unknown error on Llama Guard.

The following is the configuration I am using:

Remote inferencing is enabled.

config.py for guard:

# Imports assumed from the guardrails package; adjust to your install.
from guardrails import Guard, OnFailAction
from guardrails.hub import LlamaGuard7B

llama_guard = Guard(
    name="llama_guard"
).use(
    LlamaGuard7B,
    policies=[
        LlamaGuard7B.POLICY__NO_ILLEGAL_DRUGS,
        LlamaGuard7B.POLICY__NO_VIOLENCE_HATE,
        LlamaGuard7B.POLICY__NO_SEXUAL_CONTENT,
        LlamaGuard7B.POLICY__NO_CRIMINAL_PLANNING,
        LlamaGuard7B.POLICY__NO_GUNS_AND_ILLEGAL_WEAPONS,
        LlamaGuard7B.POLICY__NO_ENOURAGE_SELF_HARM,
    ],
    on_fail=OnFailAction.EXCEPTION,
)
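For context on the error text, "Status: None" suggests the remote inference call failed before a usable HTTP response was received. The following is a minimal sketch (hypothetical function and names, not the library's actual internals) of how such a check could produce the reported ValueError:

```python
from typing import Optional


def check_remote_response(status: Optional[int],
                          detail: str = "Unknown error on Llama Guard") -> None:
    """Raise if a remote Llama Guard call did not return HTTP 200.

    `status` is None when no response was received at all (e.g. a
    connection failure), which matches the "Status: None" in the report.
    Hypothetical helper for illustration only.
    """
    if status != 200:
        raise ValueError(
            f"Failed to get valid response from Llamaguard-7B model. "
            f"Status: {status}. Detail: {detail}"
        )
```

If the remote endpoint is unreachable or misconfigured, a check like this would raise exactly the message seen above, which points toward the remote inference setup rather than the policy list.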

To Reproduce
Steps to reproduce the behavior:

  1. Run validation with the configuration above.

Expected behavior
A valid response for the llmOutput message

Library version: 0.6.6

Labels: bug