
Error trying to use any non-OpenAI model on OpenRouter #78

@stabilus

Description


OPENAI_API_BASE="https://openrouter.ai/api/v1"
OPENAI_PROXY_MODELS="gpt-4o-mini"

works, because gpt-4o-mini is an OpenAI model that gets routed/translated properly.

But trying to use x-ai/grok-3-mini-beta on OpenRouter, where the definition should be

OPENAI_PROXY_MODELS="openrouter/x-ai/grok-3-mini-beta"

yields this:

Final output

{'result': 'Error running crew: litellm.AuthenticationError: AuthenticationError: OpenrouterException - {"error":{"message":"No auth credentials found","code":401}}'}

Stack trace (newlines restored from the escaped 'stack_trace' field):

```
Traceback (most recent call last):
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\litellm\llms\openai_like\chat\handler.py", line 372, in completion
    response = client.post(
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\litellm\llms\custom_httpx\http_handler.py", line 553, in post
    raise e
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\litellm\llms\custom_httpx\http_handler.py", line 534, in post
    response.raise_for_status()
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\httpx\_models.py", line 763, in raise_for_status
    raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Client error '401 Unauthorized' for url 'https://openrouter.ai/api/v1/chat/completions'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/401

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\litellm\main.py", line 2237, in completion
    response = openai_like_chat_completion.completion(
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\litellm\llms\openai_like\chat\handler.py", line 378, in completion
    raise OpenAILikeError(
litellm.llms.openai_like.common_utils.OpenAILikeError: {"error":{"message":"No auth credentials found","code":401}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:\AI Software\CrewAI-Studio\app\pg_crew_run.py", line 62, in run_crew
    result = crewai_crew.kickoff(inputs=inputs)
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\crewai\crew.py", line 646, in kickoff
    result = self._run_sequential_process()
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\crewai\crew.py", line 758, in _run_sequential_process
    return self._execute_tasks(self.tasks)
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\crewai\crew.py", line 861, in _execute_tasks
    task_output = task.execute_sync(
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\opentelemetry\instrumentation\crewai\instrumentation.py", line 61, in wrapper
    return func(tracer, duration_histogram, token_histogram, wrapped, instance, args, kwargs)
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\opentelemetry\instrumentation\crewai\instrumentation.py", line 149, in wrap_task_execute
    result = wrapped(*args, **kwargs)
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\crewai\task.py", line 328, in execute_sync
    return self._execute_core(agent, context, tools)
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\crewai\task.py", line 472, in _execute_core
    raise e  # Re-raise the exception after emitting the event
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\crewai\task.py", line 392, in _execute_core
    result = agent.execute_task(
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\opentelemetry\instrumentation\crewai\instrumentation.py", line 61, in wrapper
    return func(tracer, duration_histogram, token_histogram, wrapped, instance, args, kwargs)
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\opentelemetry\instrumentation\crewai\instrumentation.py", line 108, in wrap_agent_execute_task
    result = wrapped(*args, **kwargs)
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\crewai\agent.py", line 269, in execute_task
    raise e
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\crewai\agent.py", line 250, in execute_task
    result = self.agent_executor.invoke(
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\crewai\agents\crew_agent_executor.py", line 123, in invoke
    raise e
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\crewai\agents\crew_agent_executor.py", line 112, in invoke
    formatted_answer = self._invoke_loop()
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\crewai\agents\crew_agent_executor.py", line 208, in _invoke_loop
    raise e
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\crewai\agents\crew_agent_executor.py", line 155, in _invoke_loop
    answer = get_llm_response(
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\crewai\utilities\agent_utils.py", line 157, in get_llm_response
    raise e
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\crewai\utilities\agent_utils.py", line 148, in get_llm_response
    answer = llm.call(
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\opentelemetry\instrumentation\crewai\instrumentation.py", line 61, in wrapper
    return func(tracer, duration_histogram, token_histogram, wrapped, instance, args, kwargs)
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\opentelemetry\instrumentation\crewai\instrumentation.py", line 165, in wrap_llm_call
    result = wrapped(*args, **kwargs)
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\crewai\llm.py", line 794, in call
    return self._handle_non_streaming_response(
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\crewai\llm.py", line 630, in _handle_non_streaming_response
    response = litellm.completion(**params)
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\litellm\utils.py", line 1154, in wrapper
    raise e
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\litellm\utils.py", line 1032, in wrapper
    result = original_function(*args, **kwargs)
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\litellm\main.py", line 3068, in completion
    raise exception_type(
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\litellm\litellm_core_utils\exception_mapping_utils.py", line 2201, in exception_type
    raise e
  File "D:\AI Software\CrewAI-Studio\venv\Lib\site-packages\litellm\litellm_core_utils\exception_mapping_utils.py", line 2073, in exception_type
    raise AuthenticationError(
litellm.exceptions.AuthenticationError: litellm.AuthenticationError: AuthenticationError: OpenrouterException - {"error":{"message":"No auth credentials found","code":401}}
```

The core issue is that LiteLLM needs to know explicitly that these are OpenRouter models, and the current code doesn't provide that information.
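A possible workaround (untested here, and assuming CrewAI-Studio passes these variables through to LiteLLM unchanged) is to rely on LiteLLM's native OpenRouter provider rather than the OpenAI-compatible base URL, supplying the key under the env-var name LiteLLM actually reads for that provider:

```shell
# Hypothetical .env sketch: keep the openrouter/ prefix on the model so
# LiteLLM selects its OpenRouter provider, which authenticates with
# OPENROUTER_API_KEY instead of OPENAI_API_KEY.
OPENROUTER_API_KEY="sk-or-..."
OPENAI_PROXY_MODELS="openrouter/x-ai/grok-3-mini-beta"
# OPENAI_API_BASE should then be unnecessary: LiteLLM's openrouter
# provider already defaults to https://openrouter.ai/api/v1
```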
