LiteLLM not launching Tools #461
Description
When you select an LLM in Agents such as "ollama/any_model", "gemini/gemini-2.5-pro", or "azure/gpt-4o" (which, as I understand it, routes through LiteLLM), the agent or MCP does not use the tool: it only answers from the assistant itself (hallucinating) instead of getting a response from the tools.
However, when you set environment variables instead, it works.
These variables work well with Ollama (Windows):

```bat
set OPENAI_API_KEY=none
set OPENAI_API_BASE=http://localhost:11434/v1
set OPENAI_BASE_URL=http://localhost:11434/v1
set OPENAI_MODEL_NAME=mistral-small3.1
set MODEL_NAME=mistral-small3.1
```
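As a cross-platform alternative to `set`, the same workaround can be applied from Python by setting the variables before constructing the agents (a minimal sketch; the variable names are the ones from the workaround above, and `mistral-small3.1` is the local Ollama model assumed here):

```python
import os

# Point the OpenAI-compatible client at a local Ollama server
# (same variables as the Windows `set` commands above).
os.environ["OPENAI_API_KEY"] = "none"
os.environ["OPENAI_API_BASE"] = "http://localhost:11434/v1"
os.environ["OPENAI_BASE_URL"] = "http://localhost:11434/v1"
os.environ["OPENAI_MODEL_NAME"] = "mistral-small3.1"
os.environ["MODEL_NAME"] = "mistral-small3.1"
```

These lines must run before the agent framework reads its configuration, so they belong at the very top of the script.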
BASE CODE

```python
from praisonaiagents import Agent, Task, PraisonAIAgents
from praisonaiagents.tools import duckduckgo

search_agent = Agent(
    name="SearchAgent",
    role="Internet Search Specialist",
    goal="Perform accurate internet searches and extract relevant information.",
    backstory="Expert in finding and organizing internet data.",
    tools=[duckduckgo],
    self_reflect=False,
    llm="mistral-small3.1"
)

search_task = Task(
    description="Search for 'AI major news in 2025' and analyze the results.",
    expected_output="List of key AI trends with sources.",
    agent=search_agent,
    name="search_trends"
)

agents = PraisonAIAgents(
    agents=[search_agent],
    tasks=[search_task],
    process="sequential"
)

agents.start()
```