Llama 4 Maverick in agent mode #7199
planetf1 started this conversation in Models + Providers · Replies: 0 comments
Has anyone tried Llama 4 Maverick in agent mode?
My first attempt failed when the model tried to invoke a tool; instead of a parsed tool call, the output came back as raw text:
<|python_start|>{"type": "function", "name": "file_glob_search", "parameters": {"pattern": "**/config.yaml"}}<|python_end|>
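As a stopgap while the prompt template is sorted out, one could strip the sentinel tokens client-side and parse the JSON payload directly. This is only a sketch under the assumption that the provider returns the call verbatim as text; the `extract_tool_calls` helper and the delimiter regex are my own, not part of any client's API:

```python
import json
import re

# Assumption: Llama 4 Maverick wraps tool calls in <|python_start|>...<|python_end|>
# sentinel tokens inside the plain-text completion. This regex pulls out the
# JSON payloads so the caller can dispatch the tool calls itself.
TOOL_CALL_RE = re.compile(r"<\|python_start\|>(.*?)<\|python_end\|>", re.DOTALL)

def extract_tool_calls(text: str) -> list[dict]:
    """Return all parseable tool-call dicts found in the model output."""
    calls = []
    for payload in TOOL_CALL_RE.findall(text):
        try:
            calls.append(json.loads(payload))
        except json.JSONDecodeError:
            pass  # leave malformed payloads for the caller to inspect
    return calls

output = ('<|python_start|>{"type": "function", "name": "file_glob_search", '
          '"parameters": {"pattern": "**/config.yaml"}}<|python_end|>')
calls = extract_tool_calls(output)
```

This only papers over the symptom; fixing the chat template on the serving side is the proper solution.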
So it looks like I need to customize the prompts.
There's some info at https://ollama.com/library/llama4:maverick for Ollama, but I am running on a remote inference provider and don't have much insight into the configuration.
Has anyone tried Maverick, or can anyone recommend how to do the prompt customization? I'm slightly confused about how to approach this.