
Commit 663e17b: adds tools diagram, and providers (1 parent: cc97d48)

File tree: 1 file changed (+13, -4 lines)

units/en/unit2/continue-client.mdx

````diff
--- a/units/en/unit2/continue-client.mdx
+++ b/units/en/unit2/continue-client.mdx
@@ -20,7 +20,7 @@ You can install Continue from the VS Code marketplace.
 
 ![sidebar vs code demo](https://docs.continue.dev/assets/images/move-to-right-sidebar-b2d315296198e41046fc174d8178f30a.gif)
 
-With Continue configured, we'll move on to setting up Ollama to pull local models.
+With Continue configured, we'll move on to setting up Ollama to pull local models.
 
 ### Ollama local models
 
````
````diff
@@ -33,6 +33,14 @@ For example, you can download the [llama 3.1:8b](https://ollama.com/models/llama
 ```bash
 ollama pull llama3.1:8b
 ```
+<Tip>
+It is possible
+to use other local model providers, like [Llama.cpp](https://docs.continue.dev/customize/model-providers/more/llamacpp) and [LM Studio](https://docs.continue.dev/customize/model-providers/more/lmstudio), by updating the
+model provider in the configuration files below. However, Continue has been
+tested with Ollama, and it is recommended for the best experience.
+
+Details on all available model providers can be found in the [Continue documentation](https://docs.continue.dev/customize/model-providers).
+</Tip>
 
 It is important that we use models that have tool calling as a built-in feature, i.e. Codestral, Qwen, and Llama 3.1x.
 
````
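To illustrate the Tip's point about swapping providers, here is a minimal sketch of what an alternative model entry in Continue's `config.yaml` might look like. The provider id `lmstudio` and the model name are assumptions to be checked against the Continue model-provider docs, not tested settings:

```yaml
# Hypothetical sketch: serving the same model from LM Studio instead of
# Ollama. Field values are assumptions; verify them against the Continue
# model-provider documentation before use.
models:
  - name: Llama 3.1 8B (LM Studio)
    provider: lmstudio
    model: llama3.1:8b
```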
````diff
@@ -55,9 +63,8 @@ models:
  - edit
 ```
 
-By default, the max context length is `8192` tokens. This setup includes a larger use of
-that context window to perform multiple MCP requests and also allotment for more
-tokens will be necessary.
+By default, each model has a max context length; in this case it is `128000` tokens. This setup makes heavier use of
+that context window to perform multiple MCP requests, so it needs to handle more tokens.
 
 ## How it works
 
````
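As a concrete sketch of where that context length is set, a Continue `config.yaml` model entry can carry it under the model's completion options. This is a hedged example assuming the `defaultCompletionOptions.contextLength` field described in the Continue configuration reference; treat the values as illustrative rather than a drop-in config:

```yaml
# Sketch of a model entry with an enlarged context window for
# multi-step MCP tool use. Field names follow the Continue config
# reference; values are illustrative.
models:
  - name: Llama 3.1 8B
    provider: ollama
    model: llama3.1:8b
    defaultCompletionOptions:
      contextLength: 128000
```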
````diff
@@ -68,6 +75,8 @@ They are provided to the model as a JSON object with a name and an arguments
 schema. For example, a `read_file` tool with a `filepath` argument will give the
 model the ability to request the contents of a specific file.
 
+![autonomous agents diagram](https://gist.github.com/user-attachments/assets/c7301fc0-fa5c-4dc4-9955-7ba8a6587b7a)
+
 The following handshake describes how the Agent uses tools:
 
 1. In Agent mode, available tools are sent along with `user` chat requests
````
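The tool shape described above (a name plus a JSON arguments schema) can be sketched in a few lines of Python. The dict layout and the `handle_tool_call` dispatcher are invented for illustration; they are not Continue's actual wire format:

```python
import tempfile

# Illustrative tool definition: a name plus an arguments schema, as sent
# to the model in Agent mode. The exact field layout is a generic sketch,
# not the format Continue uses on the wire.
READ_FILE_TOOL = {
    "name": "read_file",
    "parameters": {
        "type": "object",
        "properties": {"filepath": {"type": "string"}},
        "required": ["filepath"],
    },
}

def handle_tool_call(call: dict) -> str:
    """Run a tool call requested by the model and return its result."""
    if call["name"] == "read_file":
        with open(call["arguments"]["filepath"]) as f:
            return f.read()
    raise ValueError(f"unsupported tool: {call['name']}")

# Example: the model asks to read a file; the client executes the call
# and would send the returned text back as a tool-result message.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as tmp:
    tmp.write("hello from the workspace")
result = handle_tool_call(
    {"name": "read_file", "arguments": {"filepath": tmp.name}}
)
```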
