Commit 2d620f3: docs(getting started): recommend Instruct models

1 parent b34f097

3 files changed: 5 additions, 3 deletions

docs/guide/choosing-a-model.md (1 addition, 1 deletion)

@@ -83,7 +83,7 @@ npx --no node-llama-cpp inspect estimate <model-file-url>
 ```
 :::
 
-### What do you need this model for? (chat, code completion, analyzing data, classification, etc.)
+### What do you need this model for? (chat, code completion, analyzing data, classification, etc.) {#model-purpose}
 There are plenty of models with different areas of expertise and capabilities.
 
 When you choose a model that is more specialized in the task you need it for, it will usually perform better than a general model.

docs/guide/downloading-models.md (2 additions, 2 deletions)

@@ -126,7 +126,7 @@ or the [`resolveModelFile`](../api/functions/resolveModelFile.md) method will au
 Alternatively, you can use the token in the [`tokens`](../api/type-aliases/ModelDownloaderOptions.md#tokens) option when using [`createModelDownloader`](../api/functions/createModelDownloader.md) or [`resolveModelFile`](../api/functions/resolveModelFile.md).
 
 ## Inspecting Remote Models
-You can inspect the metadata of a remote model without downloading it by either using the [`inspect gguf` command](../cli/inspect/gguf.md) with a URL,
+You can inspect the metadata of a remote model without downloading it by either using the [`inspect gguf`](../cli/inspect/gguf.md) command with a URL,
 or using the [`readGgufFileInfo`](../api/functions/readGgufFileInfo.md) method with a URL:
 ```typescript
 import {readGgufFileInfo} from "node-llama-cpp";
@@ -140,7 +140,7 @@ const modelMetadata = await readGgufFileInfo("<model url>");
 It's handy to check the compatibility of a remote model with your current machine hardware before downloading it,
 so you won't waste time downloading a model that won't work on your machine.
 
-You can do so using the [`inspect estimate` command](../cli/inspect/estimate.md) with a URL:
+You can do so using the [`inspect estimate`](../cli/inspect/estimate.md) command with a URL:
 ```shell
 npx --no node-llama-cpp inspect estimate <model-url>
 ```

docs/guide/index.md (2 additions, 0 deletions)

@@ -55,6 +55,8 @@ We recommend getting a GGUF model from either [Michael Radermacher on Hugging Fa
 
 We recommend starting by getting a small model that doesn't have a lot of parameters just to ensure everything works, so try downloading a `7B`/`8B` parameters model first (search for models with both `7B`/`8B` and `GGUF` in their name).
 
+To ensure you can chat with the model, make sure you [choose an Instruct model](./choosing-a-model.md#model-purpose) by looking for `Instruct` or `it` in the model name.
+
 For improved download speeds, you can use the [`pull`](../cli/pull.md) command to download a model:
 ```shell
 npx --no node-llama-cpp pull --dir ./models <model-file-url>
