
Commit 4444c85

Merge pull request #34 from rafvasq/fix-typos
Copy edit typos
2 parents 195c8a3 + db80cd2 commit 4444c85

7 files changed: +11 −11 lines changed


docs/lab-1.5/README.md

Lines changed: 1 addition & 1 deletion
@@ -39,7 +39,7 @@ The first response may take a minute to process. This is because `ollama` is spi
 
 ![batman](../images/openwebui_who_is_batman.png)
 
-You may notice that your answer is slighty different then the screen shot above. This is expected and nothing to worry about!
+You may notice that your answer is slightly different then the screen shot above. This is expected and nothing to worry about!
 
 ## Conclusion
 

docs/lab-1/README.md

Lines changed: 1 addition & 1 deletion
@@ -45,7 +45,7 @@ The first response may take a minute to process. This is because `ollama` is spi
 
 ![who is batman](../images/anythingllm_who_is_batman.png)
 
-You may notice that your answer is slighty different then the screen shot above. This is expected and nothing to worry about!
+You may notice that your answer is slightly different then the screen shot above. This is expected and nothing to worry about!
 
 ## Conclusion
 

docs/lab-2/README.md

Lines changed: 1 addition & 1 deletion
@@ -29,7 +29,7 @@ First, use ollama to list the models that you currently have downloaded:
 ```
 ollama list
 ```
-And you'll see a list similiar to the following:
+And you'll see a list similar to the following:
 ```
 ollama list
 NAME ID SIZE MODIFIED

docs/lab-3/README.md

Lines changed: 1 addition & 1 deletion
@@ -90,7 +90,7 @@ How would you respond to client who has had their freight lost as a representati
 
 ![lost freight](../images/anythingllm_lost_freight.png)
 
-That's not a satisfactory or interesting response, right? We need to interate on it, and provide more context about the client, like what they may have lost. **Tip: always think about adding more context!**
+That's not a satisfactory or interesting response, right? We need to iterate on it, and provide more context about the client, like what they may have lost. **Tip: always think about adding more context!**
 
 ```
 The freight they lost was an industrial refrigerator, from Burbank, California to Kanas City, MO.

docs/lab-5/README.md

Lines changed: 2 additions & 2 deletions
@@ -6,10 +6,10 @@ logo: images/ibm-blue-background.png
 
 ## Configuration and Sanity Check
 
-Open up AnyThingLLM, and you should see something like the following:
+Open up AnythingLLM, and you should see something like the following:
 ![default screen](../images/anythingllm_open_screen.png)
 
-If you see this that means AnythingLLM is installed correctly, and we can continue configuration, if not, please find a workshop TA or
+If you see this that means AnythingLLM is installed correctly, and we can continue configuration. If not, please find a workshop TA or
 raise your hand we'll be there to help you ASAP.
 
 Next as a sanity check, run the following command to confirm you have the [granite4:micro](https://ollama.com/library/granite4)

docs/lab-6/README.md

Lines changed: 3 additions & 3 deletions
@@ -4,7 +4,7 @@ description: Learn how to build a simple local RAG
 logo: images/ibm-blue-background.png
 ---
 
-## Retrieval-Augmented Generation overview
+## Retrieval-Augmented Generation Overview
 The LLMs we're using for these labs have been trained on billions of parameters, but they haven't been trained on everything, and the smaller models have less general knowledge to work with.
 For example, even the latest models are trained with aged data, and they couldn't know about current events or the unique data your use-case might need.
 
@@ -30,7 +30,7 @@ ollama pull granite3.3:2b
 
 If you didn't know, the supported languages with `granite3.3:2b` now include:
 
-- English, German, Spanish, French, Japanese, Portuguese, Arabic, Czech, Italian, Korean, Dutch, and Chinese. However, users may finetune this Granite model for languages beyond these 12 languages.
+- English, German, Spanish, French, Japanese, Portuguese, Arabic, Czech, Italian, Korean, Dutch, and Chinese. However, users may fine-tune this Granite model for languages beyond these 12 languages.
 
 And the Capabilities also include:
 
@@ -60,7 +60,7 @@ For example:
 
 At first glance, the list looks pretty good. But if you know your IBM CEOs, you'll notice that it misses a few of them, and sometimes adds new names that weren't ever IBM CEOs!
 (Note: the larger granite3.3:8b does a much better job on the IBM CEOs, you can try it later)
-But we can provide the small LLM with a RAG document that supplements the model's missing informaiton with a correct list, so it will generate a better answer.
+But we can provide the small LLM with a RAG document that supplements the model's missing information with a correct list, so it will generate a better answer.
 
 Click on the "New Chat" icon to clear the context. Then download a small text file with the correct list of IBM CEOs to your Downloads folder:
 

docs/lab-7/README.md

Lines changed: 2 additions & 2 deletions
@@ -51,7 +51,7 @@ and brittle prompts with structured, maintainable, robust, and efficient AI work
 * Easily integrate the power of LLMs into legacy code-bases (mify).
 * Sketch applications by writing specifications and letting `mellea` fill in
 the details (generative slots).
-* Get started by decomposing your large unwieldy prompts into structured and maintainable mellea problems.
+* Get started by decomposing your large unwieldy prompts into structured and maintainable Mellea problems.
 
 ## Let's setup Mellea to work locally
 
@@ -119,7 +119,7 @@ With this more advance example we now have the ability to customize the email to
 personalized for the recipient. But this is just a more programmatic prompt engineering, lets see where
 Mellea really shines.
 
-### Simple email with boundries and requirements
+### Simple email with boundaries and requirements
 
 1. The first step with the power of Mellea, is adding requirements to something like this email, take a look at this first
 example:
