Commit 327c76b

Sync anythingllm and openwebui
Signed-off-by: Rafael Vasquez <[email protected]>
1 parent: 155cb74

File tree: 2 files changed (+12 −15 lines)


docs/lab-1.5/README.md

Lines changed: 10 additions & 13 deletions
@@ -4,25 +4,22 @@ description: Set up Open-WebUI to start using an LLM locally
 logo: images/ibm-blue-background.png
 ---
 
-!!! warning
-    This is **optional**. You don't need Open-WebUI if you have AnythingLLM already running.
+Let's start by configuring [Open-WebUI](../pre-work/README.md#installing-open-webui) and `ollama` to talk to one another. The following screenshots will be from a Mac, but this should be similar on Windows and Linux.
 
-Now that you have [Open-WebUI installed](../pre-work/README.md#installing-open-webui) let's configure it with `ollama` and Open-WebUI to talk to one another. The following screenshots will be from a Mac, but the gist of this should be the same on Windows and Linux.
-
-Open up Open-WebUI (assuming you've run `open-webui serve` and nothing else), and you should see something like the following:
-
-![default screen](../images/openwebui_open_screen.png)
-
-If you see something similar, Open-WebUI is installed correctly! Continue on, if not, please find a workshop TA or raise your hand for some help.
-
-Before clicking the *Getting Started* button, make sure that `ollama` has `granite3.1-dense` downloaded:
+First, if you haven't already, download the Granite 3.1 model. Make sure that `ollama` is running in the background (you may have to run `ollama serve` in its own terminal depending on how you installed it) and in another terminal run the following command:
 
 ```bash
 ollama pull granite3.1-dense:8b
 ```
 
 !!! note
-    The download may take a few minutes depending on your internet connection. In the meantime, you can check out information about model we're using [here](https://ollama.com/library/granite3.1-dense). Check out how many languages it supports and take note of its capabilities. It'll help you decide what tasks you might want to use it for.
+    The download may take a few minutes depending on your internet connection. In the meantime, you can check out information about model we're using [here](https://ollama.com/library/granite3.1-dense). Check out how many languages it supports and take note of its capabilities. It'll help you decide what tasks you might want to use it for in the future.
+
+Open up Open-WebUI (assuming you've run `open-webui serve`):
+
+![default screen](../images/openwebui_open_screen.png)
+
+If you see something similar, Open-WebUI is installed correctly! Continue on, if not, please find a workshop TA or raise your hand for some help.
 
 Click *Getting Started*. Fill out the next screen and click the *Create Admin Account*. This will be your login for your local machine. Remember that this because it will be your Open-WebUI configuration login information if want to dig deeper into it after this workshop.
 

@@ -41,4 +38,4 @@ The first response may take a minute to process. This is because `ollama` is spi
 
 You may notice that your answer is slighty different then the screen shot above. This is expected and nothing to worry about!
 
-**Congratulations!** Now you have Open-WebUI running and it's configured to work with `granite3.1-dense` and `ollama`. Have a quick chat with your model before moving on to the next lab!
+**Congratulations!** Now you have Open-WebUI running and it's configured to work with `granite3.1-dense` and `ollama`. Move on to the next lab and have a chat with your model!
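For quick reference, the setup flow the synced lab text describes reduces to three commands. This is a minimal sketch under the diff's own assumptions (`ollama` and Open-WebUI installed per the pre-work instructions); the first and last commands start long-running servers, each in its own terminal:

```shell
# Terminal 1: start the ollama server, if it isn't already running in the background
ollama serve

# Terminal 2: download the Granite 3.1 dense 8B model (may take a few minutes)
ollama pull granite3.1-dense:8b

# Terminal 2 (after the pull finishes): launch Open-WebUI, then open the
# URL it prints in your browser
open-webui serve
```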

docs/lab-1/README.md

Lines changed: 2 additions & 2 deletions
@@ -4,7 +4,7 @@ description: Set up AnythingLLM to start using an LLM locally
 logo: images/ibm-blue-background.png
 ---
 
-With [AnythingLLM installed](../pre-work/README.md#anythingllm), open the desktop application to configure it with `ollama`. The following screenshots are taken from a Mac, but this should be similar on Windows and Linux.
+Let's start by configuring [AnythingLLM installed](../pre-work/README.md#anythingllm) and `ollama` to talk to one another. The following screenshots will be from a Mac, but this should be similar on Windows and Linux.
 
 First, if you haven't already, download the Granite 3.1 model. Make sure that `ollama` is running in the background (you may have to run `ollama serve` in its own terminal depending on how you installed it) and in another terminal run the following command:
 

@@ -15,7 +15,7 @@ ollama pull granite3.1-dense:8b
 !!! note
     The download may take a few minutes depending on your internet connection. In the meantime, you can check out information about model we're using [here](https://ollama.com/library/granite3.1-dense). Check out how many languages it supports and take note of its capabilities. It'll help you decide what tasks you might want to use it for in the future.
 
-Either click on the *Get Started* button or open up settings (the 🔧 button). For now, we are going to configure the global settings for `ollama` but you can always change it in the future.
+Open the AnythingLLM desktop application and either click on the *Get Started* button or open up settings (the 🔧 button). For now, we are going to configure the global settings for `ollama` but you can always change it in the future.
 
 ![wrench icon](../images/anythingllm_wrench_icon.png)
 
