
Conversation

Mpasha17
Contributor

This PR adds an interactive model provider selection feature to the start.sh script. Users can now choose between running Morphik with local models via Ollama or with cloud models via OpenAI/Anthropic.

Changes

  • Added interactive prompt for model provider selection
  • Implemented Ollama setup with appropriate model configuration
  • Added OpenAI/Anthropic API key collection and validation
  • Updated environment file handling for API keys
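The interactive prompt described above could be sketched roughly as follows. This is an illustrative sketch only; the actual start.sh likely differs in wording, options, and the follow-up setup steps, and `choose_provider` is a hypothetical function name:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of the interactive provider prompt; not the
# actual start.sh code.
choose_provider() {
  # Menu text goes to stderr so callers can capture just the result
  echo "Select a model provider:" >&2
  echo "  1) Ollama (local models)" >&2
  echo "  2) OpenAI/Anthropic (cloud models)" >&2
  printf 'Enter choice [1-2]: ' >&2
  read -r choice
  case "$choice" in
    1) echo "ollama" ;;
    2) echo "cloud" ;;
    *) echo "invalid" ;;
  esac
}
```

A caller can then branch on the captured result, e.g. `provider=$(choose_provider)` followed by a `case "$provider" in ...` block for the provider-specific setup.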

Screenshots

Screenshot 2025-06-17 at 12 01 32 PM

Screenshot 2025-06-17 at 1 02 32 PM

jazzberry-ai bot commented Jun 17, 2025

Bug Report

1. Unnecessary API Key Prompts (Severity: Low)
   Example test case: Choose option 2 (OpenAI/Anthropic) during setup, and observe prompts for both API keys even if only one provider is intended.
   Description: The script prompts for both OpenAI and Anthropic API keys regardless of which provider the user intends to use. It should only prompt for the key of the provider the user wants to use.

2. Unconditional API Key Removal (Severity: Low)
   Example test case: Have OPENAI_API_KEY and/or ANTHROPIC_API_KEY defined in the .env file. Run start.sh and choose option 2.
   Description: The script removes any existing OPENAI_API_KEY and ANTHROPIC_API_KEY entries from the .env file before potentially adding a new entry. This might lead to unexpected behavior if the user had previously set different keys or values.
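One way to avoid the second issue (clobbering existing keys) is to replace or append only the single entry for the provider the user actually chose, leaving other keys untouched. A sketch, with an illustrative helper name (`set_env_key` is not from the PR):

```shell
#!/usr/bin/env bash
# Replace (or append) one KEY=value entry in an env file without
# touching other entries. Illustrative helper, not the PR's code.
set_env_key() {
  local file="$1" key="$2" value="$3"
  if grep -q "^${key}=" "$file" 2>/dev/null; then
    # Replace the existing entry in place ('|' delimiter avoids
    # clashing with '/' characters in the value)
    sed -i.bak "s|^${key}=.*|${key}=${value}|" "$file" && rm -f "${file}.bak"
  else
    printf '%s=%s\n' "$key" "$value" >> "$file"
  fi
}
```

With this approach, choosing OpenAI would call only `set_env_key .env OPENAI_API_KEY "$key"`, leaving any existing ANTHROPIC_API_KEY entry intact.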



rastafioul commented Jun 25, 2025

When choosing the Ollama local-model option, morphik.toml notes:
"Ollama models (modify api_base based on your deployment)"

Should there be two choices: local (Morphik in Docker, Ollama local) and Ollama local in Docker (both in Docker)?

Also, would it be possible to choose the models for the agent and for completion from a list? If not, would it be worth doing this from the UI (when the UI is in use)?
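For reference, the two deployments mentioned above would differ mainly in the `api_base` value. A sketch, where the exact morphik.toml key placement and the Docker Compose service name `ollama` are assumptions, not taken from the repo:

```toml
# Morphik in Docker, Ollama running directly on the host:
# (host.docker.internal resolves to the host machine from inside a container)
api_base = "http://host.docker.internal:11434"

# Both Morphik and Ollama in Docker on the same Compose network:
# ("ollama" is a hypothetical Compose service name)
# api_base = "http://ollama:11434"
```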
