Can't get tools to work with local model #3159
-
Hello, I would like to use a local model (via llama.cpp) as part of the agent I am writing. Unfortunately, I am not having any success. It is the simplest add/multiply example -- here is the code:
The output is:
I tried using …. I cannot afford to pay for OpenAI models. Please advise. Thank you.
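Since the original code snippet and output did not survive extraction, here is a minimal sketch of the kind of add/multiply tool setup being described, assuming llama.cpp's OpenAI-compatible server (e.g. `llama-server` on `http://localhost:8080/v1`). The tool names, endpoint, and `run_agent` helper are illustrative, not the poster's actual code.

```python
import json

# Tool schemas in the OpenAI function-calling format, which llama.cpp's
# OpenAI-compatible server also accepts. The add/multiply names mirror
# the example mentioned in the question.
TOOLS = [
    {"type": "function",
     "function": {
         "name": "add",
         "description": "Add two integers.",
         "parameters": {
             "type": "object",
             "properties": {"a": {"type": "integer"}, "b": {"type": "integer"}},
             "required": ["a", "b"]}}},
    {"type": "function",
     "function": {
         "name": "multiply",
         "description": "Multiply two integers.",
         "parameters": {
             "type": "object",
             "properties": {"a": {"type": "integer"}, "b": {"type": "integer"}},
             "required": ["a", "b"]}}},
]

def execute_tool(name: str, arguments: str) -> int:
    """Dispatch a tool call returned by the model.

    `arguments` is the JSON string found in the tool_call object."""
    args = json.loads(arguments)
    if name == "add":
        return args["a"] + args["b"]
    if name == "multiply":
        return args["a"] * args["b"]
    raise ValueError(f"unknown tool: {name}")

def run_agent(prompt: str, base_url: str = "http://localhost:8080/v1"):
    # Requires a running llama.cpp server and the `openai` package;
    # shown for illustration only, not executed here.
    from openai import OpenAI
    client = OpenAI(base_url=base_url, api_key="not-needed")
    resp = client.chat.completions.create(
        model="local",
        messages=[{"role": "user", "content": prompt}],
        tools=TOOLS)
    call = resp.choices[0].message.tool_calls[0]
    return execute_tool(call.function.name, call.function.arguments)
```

Whether `tool_calls` comes back populated at all depends on the model: as the answer below notes, the model itself must support tool use.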
Answered by SergioRubio01 on Jan 23, 2025
-
Could it be that either:
-
Thank you @SergioRubio01 and @magaton
Correct, in Ollama it shows that phi4 is not capable of using tools. Try using another model with a `tools` tag in it.

Phi 4:
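Besides checking the `tools` tag on the Ollama website, recent Ollama releases also report model capabilities through the daemon's `/api/show` endpoint. The sketch below is an assumption worth verifying against your Ollama version; the fetch needs a running daemon, so only the pure helper is exercised here.

```python
import json
import urllib.request

def model_capabilities(model: str, host: str = "http://localhost:11434"):
    # Queries Ollama's /api/show endpoint; requires a running Ollama
    # daemon, so this function is shown but not executed here.
    req = urllib.request.Request(
        f"{host}/api/show",
        data=json.dumps({"model": model}).encode(),
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as r:
        # Older Ollama versions may omit the "capabilities" field.
        return json.load(r).get("capabilities", [])

def supports_tools(capabilities) -> bool:
    # Pure helper: a model can be used as an agent tool-caller only if
    # "tools" appears in its reported capabilities.
    return "tools" in capabilities
```

For example, a completion-only model like phi4 would yield `supports_tools(...) == False`, while a tools-tagged model should report `"tools"` among its capabilities.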