feat(code-template): Addition of Code Template And Setting Up Foundation For AI Tool Calling Support #266
Conversation
Looking great @thecodacus, nice work!!

… with opt-in feature
Ollama integration is broken for me somehow; not able to test with a local LLM.
This feature needs a set of git repos to use as code templates.

Use `npx pnpm <your pnpm command>`.
I've been thinking about it; we should almost certainly have a single starter-repo registry. Once we get this merged, let's discuss that sort of repository; we can probably assemble it from a bunch of "awesome" repos that curate these starters.
Is this good to review for a merge, you think, or still WIP? No rush.

This has been identified, and @wonderwhy-er has a good provider-handling PR that's also in flight. It should resolve those sorts of issues.

Since the template list is hardcoded at the moment, shall we just merge it as is, or remove the dummy template lists?
I am ready for merge if we are deciding to add templates later.

@coleam00, any thoughts on this?
We also need a prompt-optimisation method so that smaller models can use tool calling. Adding something like this will also set up the base for an agentic approach.
Ouh boy, this is too big... I wrote elsewhere that I am in favour of doing this in smaller steps. We need more than templates; that would need to come before templates, and then templates can come on top of it, since a template is just a way to start a project when the user does not provide a link.

Also, not all models support tool calling. What is magical about Bolt is that it did not rely on that. I am a bit afraid that large, non-customisable changes would make it hard to work on more foundational issues, like file upload in the middle of a session.

Also, about not dumping things into context: I think such modes should be optional. Most users will not manage context well and will drop the tool if it forces them to. So it's an opinionated thing that should be configurable.
Did you test it with models that do not support tool use? I asked ChatGPT what it thinks:

Tool usage in AI models: the tool usage feature is model-dependent, as not all AI models support tools. Here's a quick rundown of which models are more likely to support tool usage and which are not.

Models likely to support tool usage:

Models less likely to support tool usage:

In general, tool usage tends to be a feature of larger, more advanced models specifically designed or configured for interactive workflows. If you're implementing tool calls, it's best to check the model documentation or configuration options in the hosting environment to confirm compatibility.
@wonderwhy-er, yes, I know; that's the reason I added the tool toggle switch, and it's turned off by default.
I wonder if we can detect whether the current model supports tool use. We would parse the response and detect whether there was a tool call in it. In reality, tool use under the bonnet does the same thing: it outputs structured data in JSON or XML formats.
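The parsing idea above can be sketched roughly as follows. This is a minimal illustration, not code from this PR; the `<tool_call>` tag and the `name`/`arguments` JSON shape are assumptions modelled on common conventions, and every identifier here is hypothetical.

```typescript
// Hypothetical detector: scan a plain-text model response for a
// tool-call-like structured payload (XML-style wrapper or JSON fence).
interface DetectedToolCall {
  name: string;
  args: Record<string, unknown>;
}

function extractJsonBlock(text: string): string | null {
  // Look for a fenced json block, another common convention.
  const fence = text.match(/```json\s*([\s\S]*?)```/);
  return fence ? fence[1] : null;
}

function detectToolCall(response: string): DetectedToolCall | null {
  // Case 1: an XML-style <tool_call>…</tool_call> wrapper, as some
  // open models emit; case 2: a fenced JSON block.
  const xmlMatch = response.match(/<tool_call>([\s\S]*?)<\/tool_call>/);
  const candidate = xmlMatch ? xmlMatch[1] : extractJsonBlock(response);
  if (!candidate) return null;
  try {
    const parsed = JSON.parse(candidate);
    if (
      typeof parsed.name === 'string' &&
      parsed.arguments !== null &&
      typeof parsed.arguments === 'object'
    ) {
      return { name: parsed.name, args: parsed.arguments };
    }
  } catch {
    // Not valid JSON: treat the response as ordinary prose.
  }
  return null;
}
```

A detector like this lets the app fall back to plain-text handling for models that never emit structured output.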
Initially I thought of just prompting a fresh model to select a template from a provided list, but then I routed my idea to this approach: the codebase is tightly integrated with Vercel's AI SDK for its core functions, and it fits the tool option perfectly. That way we can have a plain-text prompt with small system instructions and very strict output structures. So yes, I also think that's how the future path should look.

I am actually not thinking about the user managing context but about the AI managing its own context; the AI does not need all the file content all the time.

Code templates are, I believe, essential, as most of the time people use Bolt to build something from the ground up. But I totally agree with you on tool calling: plain-text parsing is the most desirable approach.
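The "small system instructions with very strict output structures" idea could look something like the sketch below. The registry contents, prompt wording, and function names are all invented for illustration; the PR's actual template list and prompts are not shown here.

```typescript
// Hypothetical hardcoded template registry plus a strict selection
// contract: ask the model to reply with a template name only, then
// validate the reply against the registry.
interface CodeTemplate {
  name: string;
  githubRepo: string;
  description: string;
}

const TEMPLATES: CodeTemplate[] = [
  { name: 'vite-react', githubRepo: 'example/vite-react-starter', description: 'React + Vite SPA starter' },
  { name: 'astro-blog', githubRepo: 'example/astro-blog-starter', description: 'Astro static blog starter' },
];

// Small system instruction with a very strict output structure.
function buildSelectionPrompt(templates: CodeTemplate[]): string {
  const list = templates.map((t) => `- ${t.name}: ${t.description}`).join('\n');
  return (
    "Pick the best starter template for the user's request.\n" +
    `Reply with ONLY one template name from this list, nothing else:\n${list}`
  );
}

// Validate the model's free-text reply against the registry; anything
// that is not an exact template name is rejected.
function resolveTemplate(reply: string, templates: CodeTemplate[]): CodeTemplate | null {
  const name = reply.trim().toLowerCase();
  return templates.find((t) => t.name === name) ?? null;
}
```

Because the reply is validated rather than trusted, this works the same whether the choice arrives through a formal tool call or plain-text parsing.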
I agree with most of what you said.

In my experience, having multiple smaller, focused prompts works better than one big one.

Yeah, I am thinking about it too, though I was thinking to chunk, embed, and somehow filter the whole chat.

I agree; I just would start with "import GitHub repo" or "import folder" as the first step. I actually want to make two-way folder sync work as one of the next things.
I am closing this, as I am working on a better solution.
@thecodacus Sorry I missed your ping on this earlier! Sounds good - I really appreciate you diving into an even better solution!
Yeah, it's there: #302. Putting it here for whoever stumbles onto this PR.
Perfect, thanks @thecodacus!

Addition of Code Template And Setting Up Foundation For AI Tool Calling Support
Overview
Adds an AI tool calling system that enables template selection, tool execution, and interactive user confirmations. This system integrates with existing chat functionality to provide a seamless experience for template-based project initialization and AI-driven operations.
Key Changes
Tool System
- ToolStore for managing AI operations

Message System
- annotations
- ToolMessage component for execution status

Template Management

Tool Calling Switch
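The ToolStore and ToolMessage pieces listed above could be sketched as below. This is an assumed shape, not the PR's actual implementation: the class name matches the description, but the status values, method names, and invocation record are all hypothetical.

```typescript
// Hypothetical ToolStore sketch: tracks tool invocations and their
// user-confirmation state so the UI (e.g. a ToolMessage component)
// can render execution status.
type ToolStatus = 'pending' | 'approved' | 'rejected' | 'executed';

interface ToolInvocation {
  id: string;
  toolName: string;
  args: Record<string, unknown>;
  status: ToolStatus;
}

class ToolStore {
  private invocations = new Map<string, ToolInvocation>();

  // Register a tool call emitted by the model; it starts as 'pending'
  // until the user confirms or rejects it.
  register(id: string, toolName: string, args: Record<string, unknown>): void {
    this.invocations.set(id, { id, toolName, args, status: 'pending' });
  }

  approve(id: string): void {
    this.setStatus(id, 'approved');
  }

  reject(id: string): void {
    this.setStatus(id, 'rejected');
  }

  markExecuted(id: string): void {
    this.setStatus(id, 'executed');
  }

  statusOf(id: string): ToolStatus | undefined {
    return this.invocations.get(id)?.status;
  }

  private setStatus(id: string, status: ToolStatus): void {
    const inv = this.invocations.get(id);
    if (inv) inv.status = status;
  }
}
```

Keeping the confirmation state in one store lets the chat UI subscribe to status changes instead of threading them through message props.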
Testing Status
✅ Verified
🔄 Pending
None
❌ Known Issues
Tool calling confuses the LLM when the context is huge; we need an optimised way to fix this in the future.
To mitigate that, tool calling is currently used only to select the template and is then suppressed via code.
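The mitigation described above (offer the tool only for template selection, then suppress it) could be sketched as a small gate, assuming the tool set is chosen per turn. Every name here is illustrative, not from the PR.

```typescript
// Hypothetical gate: expose the template-selection tool only on the
// first user turn, then pass no tools so the large conversation
// context cannot confuse the model.
type ToolSet = Record<string, { description: string }>;

const TEMPLATE_TOOLS: ToolSet = {
  selectTemplate: { description: 'Pick a starter template for a new project' },
};

function toolsForTurn(userMessageCount: number, toolCallingEnabled: boolean): ToolSet | undefined {
  if (!toolCallingEnabled) return undefined; // opt-in switch, off by default
  return userMessageCount <= 1 ? TEMPLATE_TOOLS : undefined; // suppress after the first turn
}
```

Returning `undefined` after the first turn means later requests stream as plain text, matching the "suppressed via code" note.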
Migration
Documentation Needed
Preview
bolt.ai.tool.calling.demo.mp4