Description
🚀 Describe the new functionality needed
Logprobs are an important tool for helping users understand token prediction outcomes. They are currently missing from Llama Stack responses.
Requirements
- The Responses Create request has a top_logprobs field, which gets translated or dropped depending on the inference provider.
- The response object contains top_logprobs, which is equal to the output value from the provider, or None (see the sketch after this list).
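
A minimal sketch of the intended flow, assuming an OpenAI-compatible client pointed at a Llama Stack endpoint. The base URL, model name, and the way logprobs are read off the response are illustrative assumptions, not the confirmed Llama Stack behavior:

```python
# Sketch only: request-side top_logprobs and response-side logprobs handling.
# Assumes Llama Stack exposes an OpenAI-compatible Responses endpoint.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8321/v1/openai/v1",  # illustrative Llama Stack URL
    api_key="none",
)

response = client.responses.create(
    model="meta-llama/Llama-3.1-8B-Instruct",  # illustrative model id
    input="Say hello.",
    top_logprobs=3,  # request field; translated or dropped per inference provider
)

# Per the requirement, the response would carry the provider's logprobs output,
# or None when the provider does not return them. The exact attribute layout on
# the output items is an assumption here.
for item in response.output:
    print(getattr(item, "logprobs", None))
```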
💡 Why is this needed? What if we don't build it?
Users who have max_logprobs set will get bad request errors.
Other thoughts
No response