
Commit 3fd4847 (parent a7c08ef)

Oct 9th updates, 0.19 updates.
Updates :) Signed-off-by: JJ Asghar <[email protected]>

File tree: 5 files changed (+61, −38 lines)


docs/README.md

Lines changed: 4 additions & 6 deletions
````diff
@@ -15,14 +15,13 @@ locally to help it gain some knowledge.
 * How to use an AI model that is built in a verifiable and legal way
 * Learn a suggested workflow on how to fine tune a model with new knowledge and skills!

-### About this workshop
+## About this workshop

 The introductory page of the workshop is broken down into the following sections:

-* [Agenda](./#agenda)
-* [Compatibility](./#compatibility)
-* [Technology Used](./#technology-used)
-* [Credits](./#credits)
+* [Agenda](#agenda)
+* [Compatibility](#compatibility)
+* [Credits](#credits)

 ## Agenda

@@ -39,7 +38,6 @@ The introductory page of the workshop is broken down into the following sections
 This workshop has been tested on the following platforms:

 * **MacOS**: version 14.5
-* **Windows**: version 11
 * **Linux**: Fedora 40

 ## Credits
````

docs/lab-1/README.md

Lines changed: 4 additions & 3 deletions
````diff
@@ -5,7 +5,7 @@
 Success! We're ready to start with the first steps on your AI journey with us today.
 With this first lab, we'll be working through the steps in this [blogpost using Granite as a code assistant](https://developer.ibm.com/tutorials/awb-local-ai-copilot-ibm-granite-code-ollama-continue/).

-In this tutorial, I will show how to use a collection of open-source components to run a feature-rich developer code assistant in Visual Studio Code while addressing data privacy, licensing, and cost challenges that are common to enterprise users. The setup is powered by local large language models (LLMs) with IBM's open-source LLM family, [Granite Code](https://github.com/ibm-granite/granite-code-models). All components run on a developer's workstation and have business-friendly licensing.
+In this tutorial, we will show how to use a collection of open-source components to run a feature-rich developer code assistant in Visual Studio Code while addressing data privacy, licensing, and cost challenges that are common to enterprise users. The setup is powered by local large language models (LLMs) with IBM's open-source LLM family, [Granite Code](https://github.com/ibm-granite/granite-code-models). All components run on a developer's workstation and have business-friendly licensing.

 There are three main barriers to adopting these tools in an enterprise setting:

@@ -17,13 +17,14 @@ There are three main barriers to adopting these tools in an enterprise setting:

 Why did we select Granite as the LLM of choice for this exercise?

-Granite Code was produced by IBM Research, with the goal of building an LLM that had only seen code which used enterprise-friendly licenses. According to section 2 of the Granite Code paper ([Granite Code Models: A Family of Open Foundation Models for Code Intelligence][paper]),the IBM Granite Code models meticulously curated their training data for licenses, and to make sure that all text did not contain any hate, abuse, or profanity.
+Granite Code was produced by IBM Research, with the goal of building an LLM that had only seen code which used enterprise-friendly licenses. According to section 2 of the Granite Code paper ([Granite Code Models: A Family of Open Foundation Models for Code Intelligence][paper]), the IBM Granite Code models meticulously curated their training data for licenses, and to make sure that all text did not contain any hate, abuse, or profanity.

 Many open LLMs available today license the model itself for derivative work, but because they bring in large amounts of training data without discriminating by license, most companies can't use the output of those models since it potentially presents intellectual property concerns. Granite

-Granite Code comes in a wide range of sizes to fit your workstation's available resources. Generally, the bigger the model, the better the results, with a tradeoff: model responses will be slower, and it will take up more resources on your machine. I chose the 20b option as my starting point for chat and the 8b option for code generation. Ollama offers a convenient pull feature to download models:
+Granite Code comes in a wide range of sizes to fit your workstation's available resources. Generally, the bigger the model, the better the results, with a tradeoff: model responses will be slower, and it will take up more resources on your machine. We chose the 20b option as our starting point for chat and the 8b option for code generation. Ollama offers a convenient pull feature to download models:

 Open up your terminal, and run the following commands:
+
 ```bash
 ollama pull granite-code:20b
 ollama pull granite-code:8b
````
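Once those pulls finish, the models are served by the local Ollama daemon. As a rough illustration of what a client like `continue` does under the hood, here is a small Python sketch; it assumes Ollama's default `http://localhost:11434` endpoint and its `/api/generate` route, and is our example rather than part of the lab:

```python
import json
import urllib.request


def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body that Ollama's /api/generate endpoint expects."""
    return {"model": model, "prompt": prompt, "stream": False}


def ask_ollama(model: str, prompt: str, host: str = "http://localhost:11434") -> str:
    """POST a prompt to a locally running Ollama server and return the generated text."""
    body = json.dumps(build_generate_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example usage (requires `ollama serve` running with the model pulled):
# print(ask_ollama("granite-code:8b", "Write a one-line docstring for a sort function."))
```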

docs/lab-2/README.md

Lines changed: 20 additions & 6 deletions
````diff
@@ -5,15 +5,17 @@ Lets play with our new found local AI Open Source AI!
 ## Sanity checks

 When you open up `continue` inside of VSCode it should look something like:
-![](https://docs.continue.dev/assets/images/understand-ca0edc3d06922dd4a95e31fa06f999ec.gif)
+![](https://docs.continue.dev/assets/images/move-to-right-sidebar-b2d315296198e41046fc174d8178f30a.gif)

 Before we go any farther, write in "Who is batman?" to verify that `ollama`,
 VSCode, and `continue` are all working correctly.

 !!! troubleshooting
     If Continue is taking a long time to respond, restart Visual Studio Code. If that doesn't resolve your issue, restart Ollama.

-    If you would like to go deeper with continue, take a look at the [official Continue.dev how-to guide]( https://docs.continue.dev/how-to-use-continue).
+    If you would like to go deeper with `continue`, take a look at the [official Continue.dev how-to guide](https://docs.continue.dev/how-to-use-continue).
+    It's worth taking a moment now if you want; otherwise, when you get home and try this on your own
+    hardware, it's awesome to see what `continue` can do.

 Now that we have our local AI co-pilot with us, let's start using it. Now these
 next examples are going to be focused on `python` but there is nothing stopping
````
````diff
@@ -39,6 +41,10 @@ Now use the `command-i` or `ctrl-i` to open up the `generate code` command palet
 write me out conways game of life using pygame
 ```

+!!! note
+    If you don't know what Conway's Game of Life is, take a look [here](https://en.wikipedia.org/wiki/Conway's_Game_of_Life) or
+    raise your hand, I'm betting the TA's would love to talk to you about it. 😁
+
 Now granite-code should start giving you a good suggestion here, it should look something like:
 ![gameoflife_v1](../images/gameoflife_v1.png)
````
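For readers who want a reference point to compare the model's suggestion against, here is a minimal, pygame-free sketch of the core Game of Life update rule (our own example, not the model's output):

```python
def next_generation(board: list[list[int]]) -> list[list[int]]:
    """Compute one step of Conway's Game of Life on a fixed-size grid.

    Cells outside the grid are treated as dead.
    """
    rows, cols = len(board), len(board[0])
    new = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Count live neighbors in the surrounding 3x3 block (minus the cell itself).
            live = sum(
                board[rr][cc]
                for rr in range(max(0, r - 1), min(rows, r + 2))
                for cc in range(max(0, c - 1), min(cols, c + 2))
                if (rr, cc) != (r, c)
            )
            # A live cell survives with 2 or 3 neighbors; a dead cell
            # becomes alive with exactly 3.
            new[r][c] = 1 if live == 3 or (board[r][c] and live == 2) else 0
    return new
```

A horizontal "blinker" row should flip to a vertical one after a single step, which is a quick way to sanity-check whatever the assistant produces.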

````diff
@@ -59,17 +65,22 @@ Don't believe me? Bring up the terminal and attempt to run this code after you a
 ![nope doesn't do anything](../images/nowork.png)

 Well that isn't good is it? Yours may be different code, or maybe it does work, but at least in this
-example we need to to get it fixed.
+example we need to get the code fixed.

 ## First pass at debugging

 I'll run the following commands to build up a virtual environment, and install some modules; let's
 see how far we get.

+!!! tip
+    If these next commands are foreign to you, it's OK. These are `python` commands, and you can just
+    copy-paste them in. If you'd like to know more about _why_ we use them, raise your hand and a TA
+    should be able to explain it to you.
+
 ```bash
 python3.11 -m venv venv
 source venv/bin/activate
-workshop-python-ai pip install pygame
+pip install pygame
 ```

 Well better, I think, but nothing still happens. So even noticing the `import pygame` tells me I need to
````
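As a side note on the virtual-environment commands above: once the `venv` is activated, Python itself can confirm it. A small sketch (ours, not part of the lab):

```python
import sys


def in_virtualenv() -> bool:
    """True when the interpreter runs inside a venv: its prefix is
    redirected away from the base installation."""
    return sys.prefix != sys.base_prefix


print("virtualenv active:", in_virtualenv())
```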
````diff
@@ -79,7 +90,10 @@ so it's more readable.
 ## Cleaning up the AI generated code

 !!! note
-    You can try using the built-in autocomplete and code assistant functions to generate any missing code. In our example, we're missing a "main" entry point to the script. Try hitting `cmd/ctrl + I` again, and typing in something like: "write a main function for my game that plays twenty rounds of Conway's game of life using the `board()` function". What happens?
+    You can try using the built-in autocomplete and code assistant functions to generate any missing code.
+    In our example, we're missing a "main" entry point to the script. Try hitting `cmd/ctrl + I` again,
+    and typing in something like: "write a main function for my game that plays twenty rounds of Conway's
+    game of life using the `board()` function." What happens?

 Cleaning up the code. Now everything is smashed together, it's hard to read the logic here, so first
 thing first, going to break up the code and add a `def main` function so I know what the entry point is.
````
````diff
@@ -126,7 +140,7 @@ function so I get a better understanding of what the program is doing.
 Go ahead and walk through your version, see the logic, and who knows, maybe it'll highlight why yours
 isn't working yet, or not; the next step will help you even more!

-## Automagicly creating tests!
+## Automagically creating tests

 One of the most powerful ways to streamline your workflow as a developer is writing good tests
 around your code. It's a safety blanket to help make sure your custom code has a way to check for
````
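To make the idea of auto-generated tests concrete, this is roughly the shape such tests take. The `count_live_cells` helper below is hypothetical, named by us purely for illustration:

```python
def count_live_cells(board):
    """Count live (truthy) cells on a Game of Life board."""
    return sum(cell for row in board for cell in row)


def test_count_live_cells_empty():
    # An all-dead board has zero live cells.
    assert count_live_cells([[0, 0], [0, 0]]) == 0


def test_count_live_cells_blinker():
    # A blinker has exactly three live cells in any orientation.
    assert count_live_cells([[0, 0, 0], [1, 1, 1], [0, 0, 0]]) == 3


if __name__ == "__main__":
    test_count_live_cells_empty()
    test_count_live_cells_blinker()
    print("all tests passed")
```

A test runner such as `pytest` would discover and run the `test_*` functions automatically.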

docs/lab-3/README.md

Lines changed: 24 additions & 23 deletions
````diff
@@ -6,6 +6,11 @@ logo: images/ilab_dog.png

 # Getting Started with InstructLab

+!!! tip
+    We are jumping in the deep end here; don't hesitate to raise your hand and ask questions.
+    We also assume you have a good _foundational_ (heh) knowledge of `python`. If not, ask
+    one of the TA's to run through this on the overhead screen.
+
 Now that we have a working VSCode instance working with the granite model, let's talk about more
 things you can do with an Open Source AI system. This is where the [InstructLab](https://instructlab.ai/)
 project comes into play.
````
````diff
@@ -25,12 +30,12 @@ it to have.
 These steps will pull down a premade `qna.yaml` so you can do a local build. Skip the `wget`, `mv`, and `ilab taxonomy diff` if you don't want to do this.

 ```bash
-python3.11 -m venv venv-instructlab-0.18-3.11
-source venv-instructlab-0.18-3.11/bin/activate
+python3.11 -m venv venv-instructlab
+source venv-instructlab/bin/activate
 pip install 'instructlab[mps]'
 which ilab
 ilab config init
-cd ~/Library/Application\ Support/instructlab/
+cd ~/.config/instructlab/
 mkdir -p taxonomy/knowledge/astronomy/constellations/Phoenix/
 wget https://raw.githubusercontent.com/instructlab/taxonomy/26b3fe21ccbb95adc06fe8ce76c7c18559e8dd05/knowledge/science/astronomy/constellations/phoenix/qna.yaml
 mv qna.yaml taxonomy/knowledge/astronomy/constellations/Phoenix/
````
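For context, the `qna.yaml` pulled above pairs seed questions with answers about the Phoenix constellation. Broadly, a knowledge `qna.yaml` of that era looks like the sketch below; field names changed between InstructLab releases, so treat this as illustrative only and check the pinned file itself for the authoritative schema (the repo and commit values here are placeholders):

```yaml
created_by: your-github-username
seed_examples:
  - question: What is the Phoenix constellation?
    answer: |
      Phoenix is a minor constellation in the southern sky, named after
      the mythical bird.
  - question: Who introduced the Phoenix constellation?
    answer: |
      It was introduced by Petrus Plancius from the observations of
      Keyser and de Houtman.
task_description: Teach the model about the Phoenix constellation.
document:
  repo: <your-docs-repo>
  commit: <pinned-commit-sha>
  patterns:
    - phoenix.md
```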
````diff
@@ -210,22 +215,18 @@ Path to taxonomy repo [taxonomy]: <ENTER>

 ```shell
 (venv) $ ilab config init
-Welcome to InstructLab CLI. This guide will help you set up your environment.
+Welcome to InstructLab CLI. This guide will help you to setup your environment.
 Please provide the following values to initiate the environment [press Enter for defaults]:
-Path to taxonomy repo [taxonomy]: <ENTER>
-`taxonomy` seems to not exists or is empty. Should I clone https://github.com/instructlab/taxonomy.git for you? [y/N]: y
+Path to taxonomy repo [/Users/USERNAME/.local/share/instructlab/taxonomy]:
+`/Users/USERNAME/.local/share/instructlab/taxonomy` seems to not exist or is empty. Should I clone https://github.com/instructlab/taxonomy.git for you? [Y/n]: y
 Cloning https://github.com/instructlab/taxonomy.git...
-Path to your model [/Users/USERNAME/Library/Caches/instructlab/models/merlinite-7b-lab-Q4_K_M.gguf]: <ENTER>
+Path to your model [/Users/USERNAME/.cache/instructlab/models/merlinite-7b-lab-Q4_K_M.gguf]: y
+Generating `/Users/USERNAME/.config/instructlab/config.yaml`...
 ```

 5) When prompted, please choose a train profile. Train profiles are GPU-specific profiles that enable accelerated training behavior. **YOU ARE ON MacOS**, please choose `No Profile (CPU-Only)` by hitting Enter. There are various flags you can utilize with individual `ilab` commands that will allow you to utilize your GPU if applicable.

 ```shell
-Welcome to InstructLab CLI. This guide will help you to setup your environment.
-Please provide the following values to initiate the environment [press Enter for defaults]:
-Path to taxonomy repo [~/Library/Application\ Support/instructlab/taxonomy]:
-Path to your model [/Users/USERNAME/Library/Caches//instructlab/models/merlinite-7b-lab-Q4_K_M.gguf]:
-Generating `~/Library/Application\ Support/instructlab/config.yaml`...
 Please choose a train profile to use:
 [0] No profile (CPU-only)
 [1] A100_H100_x2.yaml
````
````diff
@@ -243,22 +244,22 @@ Path to taxonomy repo [taxonomy]: <ENTER>

 ### `ilab` directory layout after initializing your system

-After running `ilab config init` your directories will look like the following on a Linux system:
+After running `ilab config init` your directories will look like the following on a MacOS system:

 ```shell
-├─ ~/Library/Application\ Support/instructlab/models/ (1)
-├─ ~/Library/Application\ Support/instructlab/datasets (2)
-├─ ~/Library/Application\ Support/instructlab/taxonomy (3)
-├─ ~/Library/Application\ Support/instructlab/checkpoints (4)
+├─ ~/.config/instructlab/models/ (1)
+├─ ~/.config/instructlab/datasets (2)
+├─ ~/.config/instructlab/taxonomy (3)
+├─ ~/.config/instructlab/checkpoints (4)
 ```

-1) `/Users/USERNAME/Library/Caches/instructlab/models/`: Contains all downloaded large language models, including the saved output of ones you generate with ilab.
+1) `/Users/USERNAME/.config/instructlab/models/`: Contains all downloaded large language models, including the saved output of ones you generate with ilab.

-2) `~/Library/Application\ Support/instructlab/datasets/`: Contains data output from the SDG phase, built on modifications to the taxonomy repository.
+2) `~/.config/instructlab/datasets/`: Contains data output from the SDG phase, built on modifications to the taxonomy repository.

-3) `~/Library/Application\ Support/instructlab/taxonomy/`: Contains the skill and knowledge data.
+3) `~/.config/instructlab/taxonomy/`: Contains the skill and knowledge data.

-4) `~/Users/USERNAME/Library/Caches/instructlab/checkpoints/`: Contains the output of the training process
+4) `/Users/USERNAME/.config/instructlab/checkpoints/`: Contains the output of the training process.

 ### 📥 Download the model
````

````diff
@@ -272,9 +273,9 @@ ilab model download

 ```shell
 (venv) $ ilab model download
-Downloading model from Hugging Face: instructlab/merlinite-7b-lab-GGUF@main to /Users/USERNAME/Library/Caches/instructlab/models...
+Downloading model from Hugging Face: instructlab/merlinite-7b-lab-GGUF@main to /Users/USERNAME/.config/instructlab/models...
 ...
-INFO 2024-08-01 15:05:48,464 huggingface_hub.file_download:1893: Download complete. Moving file to /Users/USERNAME/Library/Caches/instructlab/models/merlinite-7b-lab-Q4_K_M.gguf
+INFO 2024-08-01 15:05:48,464 huggingface_hub.file_download:1893: Download complete. Moving file to /Users/USERNAME/.config/instructlab/models/merlinite-7b-lab-Q4_K_M.gguf
 ```

 !!! note
````

docs/pre-work/README.md

Lines changed: 9 additions & 0 deletions
````diff
@@ -73,6 +73,15 @@ python --version
 Python 3.11.4
 ```

+#### Mac installation steps
+
+##### Terminal Installation
+
+If you need to install Python via `brew`, please do the following:
+```bash
+brew install [email protected]
+```
+
 Please confirm that your `python --version` is at least `3.11+` for the best experience.

 With this you should have the applications you need; let's start the workshop!
````
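The version requirement above can also be checked from inside Python, which is handy in setup scripts; a small sketch (ours, not from the workshop):

```python
import sys


def meets_minimum(required=(3, 11)) -> bool:
    """Return True when the running interpreter satisfies the
    workshop's 3.11+ requirement."""
    return sys.version_info[:2] >= required


print(f"Python {sys.version_info.major}.{sys.version_info.minor}:",
      "OK" if meets_minimum() else "please upgrade to 3.11+")
```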
