[BUG] Trouble with Ollama #703

@sodaken

Description

Is there an existing issue for this?

  • I have searched the existing issues

Vercel Runtime Logs

  • I have checked the Vercel Runtime Logs for errors (if applicable)

Current Behavior

Can't perform search with models loaded through Ollama.

Error Log:
Attempting to fetch models from: http://localhost:3000/config/models.json

Successfully loaded models from URL

Stream execution error: [Error: Unable to connect. Is the computer able to access the url?] {
  code: 'ConnectionRefused',
  path: 'http://localhost:11434/api/chat',
  errno: 0
}
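The ConnectionRefused error means nothing accepted a TCP connection on localhost:11434 from wherever the Morphic server process runs (for example, if Morphic runs inside a Docker container, `localhost` there is not the host machine where Ollama listens). A minimal sketch to check reachability from the same environment as the server — `is_port_open` is a hypothetical helper for diagnosis, not part of Morphic:

```python
import socket

def is_port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused, timed out, or host unreachable.
        return False

# Ollama's default port; False here reproduces the ConnectionRefused error.
print(is_port_open("localhost", 11434))
```

If this prints False from inside the container but True on the host, pointing the Ollama base URL at `host.docker.internal:11434` instead of `localhost:11434` is a common fix.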

Expected Behavior

Morphic runs the search with the configured model.

Steps To Reproduce

I'm using SearXNG as the search engine.

Ollama is running in the background and I've set up models.json as follows:

"id": "Qwen3:4b-instruct",
"name": "Qwen3:4b-instruct",
"provider": "Ollama",
"providerId": "ollama",
"enabled": true,
"toolCallType": "manual",
"toolCallModel": "phi4"

Environment

- OS: MacOS
- Browser: Safari

Anything else?

No response
