Commit 0ce93e9

Fix for 'LLM Provider NOT provided'

1 parent 59956e0 commit 0ce93e9

1 file changed: 1 addition, 1 deletion

interpreter/terminal_interface/profiles/defaults/local.py
@@ -383,7 +383,7 @@ def list_ollama_models():
         print("Model process terminated.")
 
     # Set flags for Llamafile to work with interpreter
-    interpreter.llm.model = "local"
+    interpreter.llm.model = "openai/local"
     interpreter.llm.temperature = 0
     interpreter.llm.api_base = "http://localhost:8080/v1"
     interpreter.llm.supports_functions = False
