When the system is configured to use an Ollama model served from a custom base URL, the model is not recognized. Relevant configuration:
```
model = "ollama-qwen2.5",
api_key = os.getenv("API_KEY"),
api_base = "http://192.168.1.100/api/v1",
temperature = 0.7,
```
Error received:
```
📝 Model: ollama-qwen2.5
🎯 Session ID: materialistic-cover
🔧 2 custom commands loaded
Error: Unknown model: ollama-qwen2.5. Please specify a supported model.
```
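One way to rule out the endpoint itself is to ask the Ollama server which model names it actually serves. The sketch below is not part of the original report and rests on assumptions: it calls the standard Ollama REST route `GET /api/tags` (which lists locally available models) on the default port 11434 at the same host, whereas the `/api/v1` prefix in the configuration above may belong to a proxy, in which case the path would differ.

```python
# Hypothetical check: list the models the Ollama server reports as available.
# Assumes the standard Ollama REST API on the default port 11434; the host
# matches the api_base above, but the /api/v1 prefix in the config may point
# at a proxy rather than at Ollama directly.
import json
import urllib.request

OLLAMA_HOST = "http://192.168.1.100:11434"  # assumption: default Ollama port

with urllib.request.urlopen(f"{OLLAMA_HOST}/api/tags", timeout=5) as resp:
    tags = json.load(resp)

# Each entry carries the name Ollama knows the model by (e.g. "qwen2.5:latest"),
# which is the string the client configuration would need to match.
for model in tags.get("models", []):
    print(model["name"])
```

If the names printed here differ from the configured one (for example `qwen2.5:latest` versus `ollama-qwen2.5`), that mismatch could be what triggers the "Unknown model" error, though the error may also come from the client's own model registry rather than from the server.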