
Commit e42f59d

Update llm_inference.md (#215)
Signed-off-by: alabulei1 <[email protected]>
1 parent: f166bad

File tree

1 file changed: +3 -3 lines changed


docs/develop/rust/wasinn/llm_inference.md

Lines changed: 3 additions & 3 deletions
````diff
@@ -52,7 +52,7 @@ Because the example already includes a compiled WASM file from the Rust code, we
 First, get the latest llama-chat wasm application
 
 ```bash
-curl -LO https://github.com/second-state/LlamaEdge/releases/latest/download/llama-chat.wasm
+curl -LO https://github.com/LlamaEdge/LlamaEdge/releases/latest/download/llama-chat.wasm
 ```
 
 Next, let's get the model. In this example, we are going to use the llama2 7b chat model in GGUF format. You can also use other kinds of llama2 models, check out [here](https://github.com/second-state/llamaedge/blob/main/chat/README.md#get-model).
````
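For context, the tutorial step right after this hunk downloads the model itself. A minimal sketch of that step, assuming the GGUF file is pulled from a Hugging Face mirror (the repository and filename below are illustrative assumptions, not part of this commit):

```bash
# Hypothetical example: pull a llama2 7b chat model in GGUF format.
# The repo and filename are assumptions; the linked get-model README
# lists the actual supported llama2 variants.
curl -LO https://huggingface.co/second-state/Llama-2-7B-Chat-GGUF/resolve/main/llama-2-7b-chat.Q5_K_M.gguf
```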
````diff
@@ -85,7 +85,7 @@ The total cost of four apples is 20 dollars.
 Let's build the wasm file from the rust source code. First, git clone the `llamaedge` repo.
 
 ```bash
-git clone https://github.com/second-state/llamaedge.git
+git clone https://github.com/LlamaEdge/LlamaEdge.git
 cd chat
 ```
 
````
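Once the repo is cloned, the tutorial builds the wasm module from the Rust source in the `chat` directory. A sketch of the usual build step, assuming the standard wasm32-wasi target (the target triple and output path are assumptions, not shown in this diff):

```bash
# Compile the chat example to WebAssembly in release mode.
cargo build --target wasm32-wasi --release
# The compiled module would typically land at:
#   target/wasm32-wasi/release/llama-chat.wasm
```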
````diff
@@ -143,7 +143,7 @@ You can configure the chat inference application through CLI options.
 Print help
 ```
 
-The `--prompt-template` option is perhaps the most interesting. It allows the application to support different open source LLM models beyond llama2.
+The `--prompt-template` option is perhaps the most interesting. It allows the application to support different open source LLM models beyond llama2. Check out more prompt templates [here](https://github.com/LlamaEdge/LlamaEdge/tree/main/api-server/chat-prompts).
 
 | Template name | Model | Download |
 | ------------ | ------------------------------ | --- |
````
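To show the `--prompt-template` option in context, here is a hypothetical invocation, assuming WasmEdge with the WASI-NN ggml plugin installed and a GGUF model in the working directory (the model filename and template name below are assumptions):

```bash
# Hypothetical run: preload a GGUF model via WASI-NN, then select the
# prompt template matching that model family.
wasmedge --dir .:. \
  --nn-preload default:GGML:AUTO:llama-2-7b-chat.Q5_K_M.gguf \
  llama-chat.wasm --prompt-template llama-2-chat
```

Picking a different template name from the table below would let the same wasm binary drive a different open source model family.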
