@@ -9,9 +9,9 @@ You can [save and restore a chat history](./chat-session.md#save-and-restore) on
:::

To interact with a model in a chat form, you can use [`LlamaChatSession`](../api/classes/LlamaChatSession.md),
-which is stateful chat session that manages the chat state on its own.
+which is a stateful chat session that manages the chat state on its own.

-When building a library around `node-llama-cpp`, you may want to store that chat state externally and control the evaluations on your own.
+When building a library around `node-llama-cpp`, you may want to store that chat state externally and control the evaluations yourself.

This is where [`LlamaChat`](../api/classes/LlamaChat.md) may come in handy.
[`LlamaChat`](../api/classes/LlamaChat.md) allows you to generate a completion for an existing chat session and manage the evaluation yourself,
@@ -69,9 +69,9 @@ const res = await llamaChat.generateResponse(chatHistory, {
console.log("AI: " + res.response);
```

-Now, let say we want to ask the model a follow-up question based on the previous response.
+Now, let's say we want to ask the model a follow-up question based on the previous response.
Since we already have a context sequence loaded with the previous chat history,
-we'd want to use it as much a possible.
+we'd want to reuse it as much as possible.

To do so, we pass the context window of the previous evaluation output to the new evaluation.
This is important, since if a context shift has happened, we want to use the existing post-context-shift context sequence state
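
For the docs page being edited here, the follow-up step described above could be illustrated with a sketch like the following. It assumes the `llamaChat` and `res` variables from the earlier snippets are in scope, and that the evaluation result exposes a `lastEvaluation` object with `cleanHistory` and `contextWindow` fields that can be fed back via the `lastEvaluationContextWindow` option; treat the exact option names as an assumption to verify against the [`LlamaChat`](../api/classes/LlamaChat.md) API reference rather than as confirmed code (it is also not runnable standalone, since it needs a loaded model and context sequence).

```typescript
// Hypothetical follow-up turn; assumes `llamaChat` and the previous
// result `res` exist as in the earlier snippets on this page.

// Start from the clean history of the previous evaluation and
// append the new user question plus an empty model response to complete.
const chatHistory = res.lastEvaluation.cleanHistory;
chatHistory.push({type: "user", text: "And what about Venus?"});
chatHistory.push({type: "model", response: [""]});

// Mirror the same two items onto the previous context window, so the
// evaluation can reuse the existing context sequence state
// (including any post-context-shift state).
const chatHistoryContextWindow = res.lastEvaluation.contextWindow;
chatHistoryContextWindow.push({type: "user", text: "And what about Venus?"});
chatHistoryContextWindow.push({type: "model", response: [""]});

const res2 = await llamaChat.generateResponse(chatHistory, {
    lastEvaluationContextWindow: {
        history: chatHistoryContextWindow
    }
});
console.log("AI: " + res2.response);
```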