
Commit b9d91b3

docs: typos

1 parent f7247ae commit b9d91b3

File tree

1 file changed (+4, −4 lines)

docs/guide/external-chat-state.md

Lines changed: 4 additions & 4 deletions
````diff
@@ -9,9 +9,9 @@ You can [save and restore a chat history](./chat-session.md#save-and-restore) on
 :::
 
 To interact with a model in a chat form, you can use [`LlamaChatSession`](../api/classes/LlamaChatSession.md),
-which is stateful chat session that manages the chat state on its own.
+which is a stateful chat session that manages the chat state on its own.
 
-When building a library around `node-llama-cpp`, you may want to store that chat state externally and control the evaluations on your own.
+When building a library around `node-llama-cpp`, you may want to store that chat state externally and control the evaluations yourself.
 
 This is where [`LlamaChat`](../api/classes/LlamaChat.md) may come in handy.
 [`LlamaChat`](../api/classes/LlamaChat.md) Allows you to generate a completion to an existing chat session and manage the evaluation yourself,
````
````diff
@@ -69,9 +69,9 @@ const res = await llamaChat.generateResponse(chatHistory, {
 console.log("AI: " + res.response);
 ```
 
-Now, let say we want to ask the model a follow-up question based on the previous response.
+Now, let's say we want to ask the model a follow-up question based on the previous response.
 Since we already have a context sequence loaded with the previous chat history,
-we'd want to use it as much a possible.
+we'd want to reuse it as much a possible.
 
 To do so, we pass the context window of the previous evaluation output to the new evaluation.
 This is important, since if a context shift has happened, we want to use the existing post-context-shift context sequence state
````
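The follow-up pattern described in the changed docs — generating a response with `LlamaChat` and then reusing the previous evaluation's context window for the next question — might be sketched roughly as below. This is a non-authoritative sketch: the model path is a placeholder, and the option and field names (`lastEvaluationContextWindow`, `contextShift`, `res.lastEvaluation.*`) are assumptions based on the surrounding guide text that should be checked against the `LlamaChat` API reference.

```typescript
// Sketch: asking a follow-up question while reusing the previous
// evaluation's context window, so the loaded context sequence state
// is reused as much as possible (including after a context shift).
import {getLlama, LlamaChat} from "node-llama-cpp";

const llama = await getLlama();
const model = await llama.loadModel({modelPath: "path/to/model.gguf"}); // placeholder path
const context = await model.createContext();
const llamaChat = new LlamaChat({contextSequence: context.getSequence()});

// First question
let chatHistory = llamaChat.chatWrapper.generateInitialChatHistory();
chatHistory.push({type: "user", text: "What is the capital of France?"});
chatHistory.push({type: "model", response: []});

const res = await llamaChat.generateResponse(chatHistory);
console.log("AI: " + res.response);

// Follow-up question: continue from the cleaned history of the previous
// evaluation and pass its context window and context-shift metadata along.
chatHistory = res.lastEvaluation.cleanHistory;
chatHistory.push({type: "user", text: "How many people live there?"});
chatHistory.push({type: "model", response: []});

const res2 = await llamaChat.generateResponse(chatHistory, {
    lastEvaluationContextWindow: {
        history: res.lastEvaluation.contextWindow
    },
    contextShift: {
        lastEvaluationMetadata: res.lastEvaluation.contextShiftMetadata
    }
});
console.log("AI: " + res2.response);
```

Passing the previous context window is what lets the new evaluation reuse the existing (possibly post-context-shift) sequence state instead of re-evaluating the whole chat history.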
