Commit 9395c0f

fix: pass stream parameter from Agent to LLM

The Agent class was not passing the `stream` parameter to the LLM's `get_response()` method, so the LLM always used its default `stream=True` behavior regardless of the Agent's configuration. This fix ensures that when an Agent is created with `stream=False`, or when `chat()` is called with `stream=False`, the preference is properly forwarded to the LLM.

Co-authored-by: Mervin Praison <[email protected]>

1 parent 98e7903 commit 9395c0f

File tree

  • src/praisonai-agents/praisonaiagents/agent

1 file changed: +2 −1 lines changed
src/praisonai-agents/praisonaiagents/agent/agent.py

Lines changed: 2 additions & 1 deletion

@@ -1235,7 +1235,8 @@ def chat(self, prompt, temperature=0.2, tools=None, output_json=None, output_pyd
                 agent_role=self.role,
                 agent_tools=[t.__name__ if hasattr(t, '__name__') else str(t) for t in (tools if tools is not None else self.tools)],
                 execute_tool_fn=self.execute_tool,  # Pass tool execution function
-                reasoning_steps=reasoning_steps
+                reasoning_steps=reasoning_steps,
+                stream=stream  # Pass the stream parameter from chat method
             )

             self.chat_history.append({"role": "assistant", "content": response_text})
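The bug class this commit fixes is a parameter that exists on the outer wrapper but is silently dropped before reaching the inner call, so the inner default always wins. The sketch below illustrates the forwarding pattern with hypothetical `Agent`/`LLM` stand-ins (the names and signatures are illustrative, not the actual praisonaiagents API):

```python
class LLM:
    def get_response(self, prompt, stream=True):
        # Inner default is stream=True; it applies whenever the caller
        # fails to forward its own preference.
        return f"stream={stream}: {prompt}"


class Agent:
    def __init__(self, llm, stream=True):
        self.llm = llm
        self.stream = stream  # Agent-level default

    def chat(self, prompt, stream=None):
        # A chat()-level argument overrides the Agent default;
        # otherwise fall back to the Agent's configuration.
        effective = self.stream if stream is None else stream
        # The fix: explicitly pass stream through instead of
        # letting the LLM's own default apply.
        return self.llm.get_response(prompt, stream=effective)


agent = Agent(LLM(), stream=False)
print(agent.chat("hi"))               # honors the Agent's stream=False
print(agent.chat("hi", stream=True))  # per-call override wins
```

Before the fix, the equivalent of `chat()` omitted `stream=` entirely, so `get_response()` always ran with its own `stream=True` default.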
