docs/lab-1.5/README.md: 10 additions & 13 deletions
@@ -4,25 +4,22 @@ description: Set up Open-WebUI to start using an LLM locally
 logo: images/ibm-blue-background.png
 ---
 
-!!! warning
-    This is **optional**. You don't need Open-WebUI if you have AnythingLLM already running.
+Let's start by configuring [Open-WebUI](../pre-work/README.md#installing-open-webui) and `ollama` to talk to one another. The following screenshots will be from a Mac, but this should be similar on Windows and Linux.
 
-Now that you have [Open-WebUI installed](../pre-work/README.md#installing-open-webui) let's configure it with `ollama` and Open-WebUI to talk to one another. The following screenshots will be from a Mac, but the gist of this should be the same on Windows and Linux.
-
-Open up Open-WebUI (assuming you've run `open-webui serve` and nothing else), and you should see something like the following:
-
-If you see something similar, Open-WebUI is installed correctly! Continue on, if not, please find a workshop TA or raise your hand for some help.
-
-Before clicking the *Getting Started* button, make sure that `ollama` has `granite3.1-dense` downloaded:
+First, if you haven't already, download the Granite 3.1 model. Make sure that `ollama` is running in the background (you may have to run `ollama serve` in its own terminal depending on how you installed it) and in another terminal run the following command:
 
 ```bash
 ollama pull granite3.1-dense:8b
 ```
 
 !!! note
-    The download may take a few minutes depending on your internet connection. In the meantime, you can check out information about model we're using [here](https://ollama.com/library/granite3.1-dense). Check out how many languages it supports and take note of its capabilities. It'll help you decide what tasks you might want to use it for.
+    The download may take a few minutes depending on your internet connection. In the meantime, you can check out information about the model we're using [here](https://ollama.com/library/granite3.1-dense). Check out how many languages it supports and take note of its capabilities. It'll help you decide what tasks you might want to use it for in the future.
+
+Open up Open-WebUI (assuming you've run `open-webui serve`):
+
+If you see something similar, Open-WebUI is installed correctly! Continue on, if not, please find a workshop TA or raise your hand for some help.
 
 Click *Getting Started*. Fill out the next screen and click the *Create Admin Account* button. This will be your login for your local machine. Remember this because it will be your Open-WebUI configuration login information if you want to dig deeper into it after this workshop.
 
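Before moving on, it can help to confirm the pull step in the hunk above actually landed. This is a minimal sketch using standard `ollama` CLI commands (`list` and `show`); exact output formatting varies by version:

```bash
# List locally downloaded models; granite3.1-dense:8b should appear in the output
ollama list

# Optional: inspect the model's details (parameters, context length, license)
ollama show granite3.1-dense:8b
```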
@@ -41,4 +38,4 @@ The first response may take a minute to process. This is because `ollama` is spi
 
 You may notice that your answer is slightly different than the screenshot above. This is expected and nothing to worry about!
 
-**Congratulations!** Now you have Open-WebUI running and it's configured to work with `granite3.1-dense` and `ollama`. Have a quick chat with your model before moving on to the next lab!
+**Congratulations!** Now you have Open-WebUI running and it's configured to work with `granite3.1-dense` and `ollama`. Move on to the next lab and have a chat with your model!
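Since the first response waits for `ollama` to load the model into memory, you can also warm the model up from a terminal before chatting. A small sketch, assuming default ports (Open-WebUI on 8080, `ollama` on 11434); adjust if you changed either:

```bash
# Terminal 1: start Open-WebUI (serves on http://localhost:8080 by default)
open-webui serve

# Terminal 2: send one prompt so the model is already loaded when you chat
ollama run granite3.1-dense:8b "Say hello in one short sentence."
```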
docs/lab-1/README.md: 2 additions & 2 deletions
@@ -4,7 +4,7 @@ description: Set up AnythingLLM to start using an LLM locally
 logo: images/ibm-blue-background.png
 ---
 
-With [AnythingLLM installed](../pre-work/README.md#anythingllm), open the desktop application to configure it with `ollama`. The following screenshots are taken from a Mac, but this should be similar on Windows and Linux.
+Let's start by configuring [AnythingLLM](../pre-work/README.md#anythingllm) and `ollama` to talk to one another. The following screenshots will be from a Mac, but this should be similar on Windows and Linux.
 
 First, if you haven't already, download the Granite 3.1 model. Make sure that `ollama` is running in the background (you may have to run `ollama serve` in its own terminal depending on how you installed it) and in another terminal run the following command:
 
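If AnythingLLM later fails to see your models, a quick reachability check against the local `ollama` server helps. This assumes the default `ollama` port of 11434; `/api/tags` is the endpoint that lists downloaded models:

```bash
# Should return JSON listing granite3.1-dense:8b among the downloaded models
curl http://localhost:11434/api/tags
```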
@@ -15,7 +15,7 @@ ollama pull granite3.1-dense:8b
 !!! note
     The download may take a few minutes depending on your internet connection. In the meantime, you can check out information about the model we're using [here](https://ollama.com/library/granite3.1-dense). Check out how many languages it supports and take note of its capabilities. It'll help you decide what tasks you might want to use it for in the future.
 
-Either click on the *Get Started* button or open up settings (the 🔧 button). For now, we are going to configure the global settings for `ollama` but you can always change it in the future.
+Open the AnythingLLM desktop application and either click on the *Get Started* button or open up settings (the 🔧 button). For now, we are going to configure the global settings for `ollama` but you can always change it in the future.
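For orientation, the global settings in that step boil down to pointing AnythingLLM at the local `ollama` endpoint. The field names below are illustrative (they vary by AnythingLLM version), and the `curl` check just confirms the endpoint AnythingLLM will call is alive:

```bash
# Typical values for AnythingLLM's LLM-provider settings (names illustrative):
#   LLM Provider:    Ollama
#   Ollama Base URL: http://localhost:11434
#   Chat Model:      granite3.1-dense:8b

# Sanity check that the local ollama server is responding
curl http://localhost:11434/api/version
```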