Ollama App1 Request Fails, ResponseError: model is required (status code: 400) #195

@Wayneless

Description

When chatting through the genai-stack-bot-1 container, the Streamlit application fails with the following error and traceback:
ResponseError: model is required (status code: 400)
File "/usr/local/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 542, in _run_script
exec(code, module.dict)
File "/app/bot.py", line 182, in
chat_input()
File "/app/bot.py", line 95, in chat_input
result = output_function(
^^^^^^^^^^^^^^^^
File "/app/chains.py", line 120, in generate_llm_output
answer = chain.invoke(
^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 3024, in invoke
input = context.run(step.invoke, input, config)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 284, in invoke
self.generate_prompt(
File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 860, in generate_prompt
return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 690, in generate
self._generate_with_cache(
File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 925, in _generate_with_cache
result = self._generate(
^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/langchain_ollama/chat_models.py", line 644, in _generate
final_chunk = self._chat_stream_with_aggregation(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/langchain_ollama/chat_models.py", line 545, in _chat_stream_with_aggregation
for stream_resp in self._create_chat_stream(messages, stop, **kwargs):
File "/usr/local/lib/python3.11/site-packages/langchain_ollama/chat_models.py", line 527, in _create_chat_stream
yield from self._client.chat(
File "/usr/local/lib/python3.11/site-packages/ollama/_client.py", line 168, in inner
raise ResponseError(e.response.text, e.response.status_code) from None
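
The 400 is returned by Ollama's /api/chat endpoint, which rejects any request whose model field is empty. In practice this usually means the ChatOllama instance built in chains.py ended up with an empty model name, for example because the environment variable that is supposed to carry it was never set inside the container. Below is a minimal sketch of the failure mode and the fix, assuming langchain_ollama is installed and an Ollama server is reachable at the default base URL; the LLM variable name is only illustrative of how genai-stack style setups pass the model name in.

import os

from langchain_ollama import ChatOllama

# Hypothetical setup: the model name comes from an environment variable.
# If the variable is unset, the "" fallback is sent to Ollama, which
# answers 400 "model is required" exactly as in the traceback above.
model_name = os.getenv("LLM", "")

llm = ChatOllama(model=model_name)
# llm = ChatOllama(model="llama3")  # naming a pulled model avoids the 400

try:
    print(llm.invoke("Hello").content)
except Exception as err:
    print(f"Request failed: {err}")

If the model name is set but the request still fails, running ollama list inside the Ollama container shows whether that model has actually been pulled.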
