docs/docs/CONTRIBUTING.md (2 additions, 2 deletions)

```diff
@@ -4,7 +4,7 @@
 
 The `DEFAULT_NUM_CTX` environment variable can be used to limit the maximum number of context values used by the qwen2.5-coder model. For example, to limit the context to 24576 values (which uses 32GB of VRAM), set `DEFAULT_NUM_CTX=24576` in your `.env.local` file.
 
-First off, thank you for considering contributing to Bolt.new! This fork aims to expand the capabilities of the original project by integrating multiple LLM providers and enhancing functionality. Every contribution helps make Bolt.new a better tool for developers worldwide.
+First off, thank you for considering contributing to Bolt.diy! This fork aims to expand the capabilities of the original project by integrating multiple LLM providers and enhancing functionality. Every contribution helps make Bolt.diy a better tool for developers worldwide.
 
 ## 📋 Table of Contents
 - [Code of Conduct](#code-of-conduct)
@@ -62,7 +62,7 @@ We're looking for dedicated contributors to help maintain and grow this project.
```
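For reference, the `DEFAULT_NUM_CTX` variable described in the hunk above is set in `.env.local`. A minimal sketch, using the 24576-value / 32GB example from the doc itself (tune the number to your GPU's VRAM):

```bash
# .env.local (illustrative): cap the context window used with qwen2.5-coder
# 24576 is the documented example that fits in roughly 32GB of VRAM
DEFAULT_NUM_CTX=24576
```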
docs/docs/FAQ.md (7 additions, 12 deletions)

```diff
@@ -1,35 +1,30 @@
 # Frequently Asked Questions (FAQ)
 
-## How do I get the best results with oTToDev?
+## How do I get the best results with Bolt.diy?
 
 - **Be specific about your stack**:
-  Mention the frameworks or libraries you want to use (e.g., Astro, Tailwind, ShadCN) in your initial prompt. This ensures that oTToDev scaffolds the project according to your preferences.
+  Mention the frameworks or libraries you want to use (e.g., Astro, Tailwind, ShadCN) in your initial prompt. This ensures that Bolt.diy scaffolds the project according to your preferences.
 
 - **Use the enhance prompt icon**:
   Before sending your prompt, click the *enhance* icon to let the AI refine your prompt. You can edit the suggested improvements before submitting.
 
 - **Scaffold the basics first, then add features**:
-  Ensure the foundational structure of your application is in place before introducing advanced functionality. This helps oTToDev establish a solid base to build on.
+  Ensure the foundational structure of your application is in place before introducing advanced functionality. This helps Bolt.diy establish a solid base to build on.
 
 - **Batch simple instructions**:
   Combine simple tasks into a single prompt to save time and reduce API credit consumption. For example:
   *"Change the color scheme, add mobile responsiveness, and restart the dev server."*
 
 ---
 
-## How do I contribute to oTToDev?
+## How do I contribute to Bolt.diy?
 
 Check out our [Contribution Guide](CONTRIBUTING.md) for more details on how to get involved!
 
 ---
 
-## Do you plan on merging oTToDev back into the official Bolt.new repo?
 
-Stay tuned! We’ll share updates on this early next month.
-
----
-
-## What are the future plans for oTToDev?
+## What are the future plans for Bolt.diy?
 
 Visit our [Roadmap](https://roadmap.sh/r/ottodev-roadmap-2ovzo) for the latest updates.
 New features and improvements are on the way!
@@ -38,13 +33,13 @@ New features and improvements are on the way!
 
 ## Why are there so many open issues/pull requests?
 
-oTToDev began as a small showcase project on @ColeMedin's YouTube channel to explore editing open-source projects with local LLMs. However, it quickly grew into a massive community effort!
+Bolt.diy began as a small showcase project on @ColeMedin's YouTube channel to explore editing open-source projects with local LLMs. However, it quickly grew into a massive community effort!
 
 We’re forming a team of maintainers to manage demand and streamline issue resolution. The maintainers are rockstars, and we’re also exploring partnerships to help the project thrive.
 
 ---
 
-## How do local LLMs compare to larger models like Claude 3.5 Sonnet for oTToDev/Bolt.new?
+## How do local LLMs compare to larger models like Claude 3.5 Sonnet for Bolt.diy?
 
 While local LLMs are improving rapidly, larger models like GPT-4o, Claude 3.5 Sonnet, and DeepSeek Coder V2 236b still offer the best results for complex applications. Our ongoing focus is to improve prompts, agents, and the platform to better support smaller local LLMs.
```
docs/docs/index.md (14 additions, 14 deletions)

```diff
@@ -1,28 +1,28 @@
-# Welcome to OTTO Dev
-This fork of Bolt.new (oTToDev) allows you to choose the LLM that you use for each prompt! Currently, you can use OpenAI, Anthropic, Ollama, OpenRouter, Gemini, LMStudio, Mistral, xAI, HuggingFace, DeepSeek, or Groq models - and it is easily extended to use any other model supported by the Vercel AI SDK! See the instructions below for running this locally and extending it to include more models.
+# Welcome to Bolt DIY
+Bolt.diy allows you to choose the LLM that you use for each prompt! Currently, you can use OpenAI, Anthropic, Ollama, OpenRouter, Gemini, LMStudio, Mistral, xAI, HuggingFace, DeepSeek, or Groq models - and it is easily extended to use any other model supported by the Vercel AI SDK! See the instructions below for running this locally and extending it to include more models.
 
-Join the community for oTToDev!
+Join the community!
 
 https://thinktank.ottomator.ai
 
-## Whats Bolt.new
+## Whats Bolt.diy
 
-Bolt.new is an AI-powered web development agent that allows you to prompt, run, edit, and deploy full-stack applications directly from your browser—no local setup required. If you're here to build your own AI-powered web dev agent using the Bolt open source codebase, [click here to get started!](./CONTRIBUTING.md)
+Bolt.diy is an AI-powered web development agent that allows you to prompt, run, edit, and deploy full-stack applications directly from your browser—no local setup required. If you're here to build your own AI-powered web dev agent using the Bolt open source codebase, [click here to get started!](./CONTRIBUTING.md)
 
-## What Makes Bolt.new Different
+## What Makes Bolt.diy Different
 
-Claude, v0, etc are incredible- but you can't install packages, run backends, or edit code. That’s where Bolt.new stands out:
+Claude, v0, etc are incredible- but you can't install packages, run backends, or edit code. That’s where Bolt.diy stands out:
 
-- **Full-Stack in the Browser**: Bolt.new integrates cutting-edge AI models with an in-browser development environment powered by **StackBlitz’s WebContainers**. This allows you to:
+- **Full-Stack in the Browser**: Bolt.diy integrates cutting-edge AI models with an in-browser development environment powered by **StackBlitz’s WebContainers**. This allows you to:
   - Install and run npm tools and libraries (like Vite, Next.js, and more)
   - Run Node.js servers
   - Interact with third-party APIs
   - Deploy to production from chat
   - Share your work via a URL
 
-- **AI with Environment Control**: Unlike traditional dev environments where the AI can only assist in code generation, Bolt.new gives AI models **complete control** over the entire environment including the filesystem, node server, package manager, terminal, and browser console. This empowers AI agents to handle the whole app lifecycle—from creation to deployment.
+- **AI with Environment Control**: Unlike traditional dev environments where the AI can only assist in code generation, Bolt.diy gives AI models **complete control** over the entire environment including the filesystem, node server, package manager, terminal, and browser console. This empowers AI agents to handle the whole app lifecycle—from creation to deployment.
 
-Whether you’re an experienced developer, a PM, or a designer, Bolt.new allows you to easily build production-grade full-stack applications.
+Whether you’re an experienced developer, a PM, or a designer, Bolt.diy allows you to easily build production-grade full-stack applications.
 
 For developers interested in building their own AI-powered development tools with WebContainers, check out the open-source Bolt codebase in this repo!
```
```diff
@@ -47,10 +47,10 @@ If you see usr/local/bin in the output then you're good to go.
 3. Clone the repository (if you haven't already) by opening a Terminal window (or CMD with admin permissions) and then typing in this:
-3. Rename .env.example to .env.local and add your LLM API keys. You will find this file on a Mac at "[your name]/bold.new-any-llm/.env.example". For Windows and Linux the path will be similar.
+3. Rename .env.example to .env.local and add your LLM API keys. You will find this file on a Mac at "[your name]/bolt.diy/.env.example". For Windows and Linux the path will be similar.
```
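The rename step in the changed line above can also be done from a terminal in the repository root; a minimal sketch (the key name shown is illustrative, use the provider keys listed in `.env.example`):

```bash
# copy (or rename) the template, then fill in keys for the providers you plan to use
cp .env.example .env.local

# e.g. inside .env.local:
# OPENAI_API_KEY=sk-...
```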
```diff
-To make new LLMs available to use in this version of Bolt.new, head on over to `app/utils/constants.ts` and find the constant MODEL_LIST. Each element in this array is an object that has the model ID for the name (get this from the provider's API documentation), a label for the frontend model dropdown, and the provider.
+To make new LLMs available to use in this version of Bolt.diy, head on over to `app/utils/constants.ts` and find the constant MODEL_LIST. Each element in this array is an object that has the model ID for the name (get this from the provider's API documentation), a label for the frontend model dropdown, and the provider.
 
 By default, Anthropic, OpenAI, Groq, and Ollama are implemented as providers, but the YouTube video for this repo covers how to extend this to work with more providers if you wish!
```
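Based on the description in the changed line above, a `MODEL_LIST` entry pairs a provider's model ID with a dropdown label and a provider name. A rough sketch (the model names are placeholders and the exact shape in `app/utils/constants.ts` may differ between versions):

```typescript
// Hypothetical sketch of MODEL_LIST entries in app/utils/constants.ts:
// name     - the model ID from the provider's API documentation
// label    - the text shown in the frontend model dropdown
// provider - one of the implemented providers (Anthropic, OpenAI, Groq, Ollama, ...)
const MODEL_LIST = [
  { name: 'gpt-4o', label: 'GPT-4o', provider: 'OpenAI' },
  { name: 'qwen2.5-coder:32b', label: 'Qwen 2.5 Coder 32B', provider: 'Ollama' },
];
```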
```diff
@@ -179,7 +179,7 @@ This will start the Remix Vite development server. You will need Google Chrome Canary
 
 ## Tips and Tricks
 
-Here are some tips to get the most out of Bolt.new:
+Here are some tips to get the most out of Bolt.diy:
 
 - **Be specific about your stack**: If you want to use specific frameworks or libraries (like Astro, Tailwind, ShadCN, or any other popular JavaScript framework), mention them in your initial prompt to ensure Bolt scaffolds the project accordingly.
```