Commit e3f77c1

fix: Update inference recorder to handle both Ollama and OpenAI model (#3470)
- Handle Ollama format where models are nested under response['body']['models']
- Fall back to OpenAI format where models are directly in response['body']

Closes: #3457
Signed-off-by: Derek Higgins <[email protected]>
1 parent 142a38d commit e3f77c1

File tree

1 file changed (+6, -1 lines)

llama_stack/testing/inference_recorder.py

Lines changed: 6 additions & 1 deletion
@@ -203,7 +203,12 @@ def _extract_model_identifiers():
     - '/v1/models' (OpenAI): response body is: [ { id: ... }, ... ]
     Returns a list of unique identifiers or None if structure doesn't match.
     """
-    items = response["body"]
+    if "models" in response["body"]:
+        # ollama
+        items = response["body"]["models"]
+    else:
+        # openai
+        items = response["body"]
     idents = [m.model if endpoint == "/api/tags" else m.id for m in items]
     return sorted(set(idents))
