Commit 8a5d594

fix: failed to process query in cot chat interface
1 parent: aeb9970

File tree

1 file changed: 8 additions, 0 deletions

agentic_rag/gradio_app.py

```diff
@@ -177,6 +177,14 @@ def chat(message: str, history: List[List[str]], agent_type: str, use_cot: bool,
         response = agent.process_query(message)
         print("Query processed successfully")
 
+        # Handle string responses from Ollama models
+        if isinstance(response, str):
+            response = {
+                "answer": response,
+                "reasoning_steps": [response] if use_cot else [],
+                "context": []
+            }
+
         # Format response with reasoning steps if CoT is enabled
         if use_cot and isinstance(response, dict) and "reasoning_steps" in response:
             formatted_response = "🤔 Let me think about this step by step:\n\n"
```
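For context, the added guard coerces a bare-string reply (as some Ollama models return) into the dict shape that the CoT formatting branch below expects. Here is a minimal standalone sketch of that normalization; the `normalize_response` helper name is hypothetical and not part of the repository:

```python
from typing import Any, Dict


def normalize_response(response: Any, use_cot: bool) -> Dict[str, Any]:
    """Coerce a plain-string model reply into the dict shape the chat formatter expects."""
    if isinstance(response, str):
        return {
            "answer": response,
            # With CoT enabled, treat the raw text as the single reasoning step.
            "reasoning_steps": [response] if use_cot else [],
            "context": [],
        }
    # Structured replies pass through unchanged.
    return response


# A string reply gets wrapped...
print(normalize_response("Paris is the capital of France.", use_cot=True))
# ...while an already-structured reply is returned as-is.
print(normalize_response({"answer": "42", "reasoning_steps": [], "context": []}, use_cot=False))
```

Wrapping the raw text as its own single reasoning step lets the step-by-step formatting branch still render something sensible when the model returns unstructured output.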
