
Commit b842e0c

Merge pull request #1124 from stackblitz-labs/leex279-patch-readme-changes-v1
docs: update README.md
2 parents e196442 + 840dd59 commit b842e0c

1 file changed: +5 -3 lines changed


README.md

Lines changed: 5 additions & 3 deletions
@@ -1,10 +1,12 @@
-# bolt.diy (Previously oTToDev)
+# bolt.diy (Previously oTToDev)
 [![bolt.diy: AI-Powered Full-Stack Web Development in the Browser](./public/social_preview_index.jpg)](https://bolt.diy)
 
 Welcome to bolt.diy, the official open source version of Bolt.new (previously known as oTToDev and bolt.new ANY LLM), which allows you to choose the LLM that you use for each prompt! Currently, you can use OpenAI, Anthropic, Ollama, OpenRouter, Gemini, LMStudio, Mistral, xAI, HuggingFace, DeepSeek, or Groq models - and it is easily extended to use any other model supported by the Vercel AI SDK! See the instructions below for running this locally and extending it to include more models.
 
-Check the [bolt.diy Docs](https://stackblitz-labs.github.io/bolt.diy/) for more information.
+-----
+Check the [bolt.diy Docs](https://stackblitz-labs.github.io/bolt.diy/) for more offical installation instructions and more informations.
 
+-----
 Also [this pinned post in our community](https://thinktank.ottomator.ai/t/videos-tutorial-helpful-content/3243) has a bunch of incredible resources for running and deploying bolt.diy yourself!
 
 We have also launched an experimental agent called the "bolt.diy Expert" that can answer common questions about bolt.diy. Find it here on the [oTTomator Live Agent Studio](https://studio.ottomator.ai/).
@@ -91,7 +93,7 @@ project, please check the [project management guide](./PROJECT.md) to get starte
 
 ## Features
 
-- **AI-powered full-stack web development** directly in your browser.
+- **AI-powered full-stack web development** for **NodeJS based applications** directly in your browser.
 - **Support for multiple LLMs** with an extensible architecture to integrate additional models.
 - **Attach images to prompts** for better contextual understanding.
 - **Integrated terminal** to view output of LLM-run commands.
