Local LLM - Llama4 #2539
Replies: 1 comment
Hello @salauioan! I'm here to help you tackle any bugs, answer your questions, and assist you in becoming a contributor. Let's work together on resolving your issue while we wait for a human maintainer. I couldn't find specific information on the difference between … Regarding the setup for local LLMs and MCP tools, there isn't detailed information available about specific system prompts or …
I am looking to set up Roo Code to work with a local LLM (Llama4 Maverick, 3-bit quantization). I got Roo Code using my local LLM quickly, and as a first test I asked it to create a simple Python program. It creates the Python file and reports that it was successfully saved to the workspace; however, the file is not in the workspace. After some research (I am new to Roo Code), it seems the code is wrapped in `<write_file>` `</write_file>` tags instead of `<write_to_file>` `</write_to_file>`. Once I refine the prompt for a new task to use the proper write-to-file format, the file is saved correctly.

Is this normal? Do I have to set up a system prompt for my local model so it can work with the tools Roo Code provides? Same for MCP tools: do I need a specific system prompt or `.roo` rules for the model to be able to use them?

NOTE: I tried with Cline too, and I have the same issue as with Roo Code.
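For anyone hitting the same symptom before a proper fix lands: since the model's output only differs from what Roo Code parses by the tag name (`write_file` vs. `write_to_file`), one stopgap is to post-process the model's responses (e.g. in a small proxy in front of the local inference server) and rewrite the non-standard tag. This is a minimal sketch of that idea, not an official Roo Code feature; the function name and the set of tags handled are my own assumptions:

```python
import re

def normalize_tool_tags(text: str) -> str:
    """Rewrite the non-standard <write_file>/</write_file> tags that some
    local models emit into the <write_to_file>/</write_to_file> tags that
    Roo Code actually parses.

    Illustrative sketch only: it handles just this one tag mismatch, and
    running it on text that already uses <write_to_file> leaves the text
    unchanged (the pattern does not match the correct tag).
    """
    return re.sub(r"<(/?)write_file>", r"<\1write_to_file>", text)
```

A hand-crafted system prompt or `.roo` rule reminding the model of the exact tag name may work just as well, but a deterministic rewrite like this does not depend on the quantized model following instructions reliably.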