The documentation on chat prompt wrappers is very hard to understand, and it lacks examples: https://withcatai.github.io/node-llama-cpp/guide/chat-prompt-wrapper

For example, with wizardCoder: how do I extend this class? The example doesn't show how to set systemPrompt, nor how to write a wrapper for codefusion.
Replies: 2 comments
You can pass a custom systemPrompt when you create a LlamaChatSession object.

Regarding your wizardCoder example, you can just customize the GeneralChatPromptWrapper with the role names you mentioned:

```javascript
import {fileURLToPath} from "url";
import path from "path";
import {
    LlamaModel, LlamaContext, LlamaChatSession, GeneralChatPromptWrapper
} from "node-llama-cpp";

const __dirname = path.dirname(fileURLToPath(import.meta.url));

const model = new LlamaModel({
    modelPath: path.join(__dirname, "models", "codellama-13b.Q3_K_M.gguf"),
    // Wrap prompts with the wizardCoder-style role names
    promptWrapper: new GeneralChatPromptWrapper({
        instructionName: "Instruction",
        responseName: "Response"
    })
});
const context = new LlamaContext({model});
const session = new LlamaChatSession({
    context,
    systemPrompt: "Below is an instruction that describes a task. Write a response that appropriately completes the request."
});

const q1 = "Hi there, how are you?";
console.log("User: " + q1);

const a1 = await session.prompt(q1);
console.log("AI: " + a1);

const q2 = "Summarize what you said";
console.log("User: " + q2);

const a2 = await session.prompt(q2);
console.log("AI: " + a2);Based on this example, you can read the source code of You're also welcome to open a PR with a |
thank you for the answer!