
Enable llm_judge detection for '/api/v1/text/generation' endpoint #21

@saichandrapandraju

Description

Extend judge detection to support the /api/v1/text/generation Detector API, since vllm_judge already provides a way to pass the prompt that caused the generation of the content being evaluated.

Request schema for /api/v1/text/generation:

{
  "prompt": "This is my amazing prompt",
  "generated_text": "Some text generated by an LLM",
  "detector_params": {}
}
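
For illustration, a request to this endpoint could look like the minimal sketch below. The base URL (http://localhost:8000) and the shape of the response are assumptions about the deployment, not part of this issue; adjust them for the actual detector service.

# Minimal sketch (assumption: the detector service listens on http://localhost:8000
# and returns JSON; adjust the URL and handling for the real deployment).
import requests

payload = {
    "prompt": "This is my amazing prompt",
    "generated_text": "Some text generated by an LLM",
    "detector_params": {},
}

response = requests.post(
    "http://localhost:8000/api/v1/text/generation",
    json=payload,
    timeout=30,
)
response.raise_for_status()
print(response.json())  # detection result for the prompt / generated_text pair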

Labels

enhancement (New feature or request)
