**docs/lab-1/README.md**
---
description: Set up AnythingLLM to start using an LLM locally
logo: images/ibm-blue-background.png
---
With [AnythingLLM installed](../pre-work/README.md#anythingllm), open the desktop application to configure it with `ollama`. The following screenshots are taken from a Mac, but this should be similar on Windows and Linux.
First, if you haven't already, download the Granite 3.1 model. Make sure `ollama` is running in the background (depending on how you installed it, you may need to run `ollama serve` in its own terminal), then run the following command in another terminal:
```bash
ollama pull granite3.1-dense:8b
```
!!! note
    The download may take a few minutes depending on your internet connection. In the meantime, you can read about the model we're using [here](https://ollama.com/library/granite3.1-dense). Take note of how many languages it supports and what its capabilities are; that will help you decide which tasks to use it for in the future.
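Once the pull completes, you can confirm the model is available locally. This sketch assumes `ollama` is on your `PATH`:

```bash
# list locally downloaded models; granite3.1-dense:8b should appear
ollama list
```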
Either click on the *Get Started* button or open up the settings (the 🔧 button). For now, we are going to configure the global settings for `ollama`, but you can always change them later.

Click on the *LLM* section, and select **Ollama** as the LLM Provider. Select the `granite3.1-dense:8b` model you downloaded. You'll be able to see all the models you have access to through `ollama` here.
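If the model doesn't appear in the dropdown, you can check what the local `ollama` server is exposing. This assumes `curl` is installed and that `ollama` is serving on its default port, 11434:

```bash
# ask the ollama API which models it has available
curl http://localhost:11434/api/tags || echo "ollama is not reachable -- is 'ollama serve' running?"
```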
**docs/pre-work/README.md**
```bash
brew install ollama
```
!!! note
    You can save time by starting the model download used for the lab in the background by running `ollama pull granite3.1-dense:8b` in its own terminal. Depending on how you installed `ollama`, you may need to run `ollama serve` first.
## Installing Visual Studio Code
You can download and install VSCode from their [website](https://code.visualstudio.com).
Download and install the IDE of your choice [here](https://www.jetbrains.com/ides/#choose-your-ide).
If you'll be using `python` (like this workshop does), pick [PyCharm](https://www.jetbrains.com/pycharm/).
## Installing Continue
Choose your IDE on their [website](https://www.continue.dev/) and install the extension.
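Later in the workshop you'll point Continue at the model you pulled. As a sketch only (the exact schema depends on your Continue version, and newer releases use `config.yaml` rather than `~/.continue/config.json`), an Ollama-backed model entry looks roughly like:

```json
{
  "models": [
    {
      "title": "Granite 3.1 8b",
      "provider": "ollama",
      "model": "granite3.1-dense:8b"
    }
  ]
}
```

The `title` here is an arbitrary display name; the `model` value must match the tag you pulled with `ollama`.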
## Installing AnythingLLM
Download and install it from their [website](https://anythingllm.com/desktop) based on your operating system. We'll configure it later in the workshop.
```bash
open-webui serve
```
Now that you have all of the tools you need, let's start building our local AI co-pilot.
**Head over to [Lab 1](/docs/lab-1/README.md) if you have AnythingLLM or [Lab 1.5](/docs/lab-1.5/README.md) for Open-WebUI.**