-
Hi all, I have a very simple flow with a "ChatLocalAI" LLM, a "Chat Prompt Template" and an "LLM Chain". My problem is that, although I have entered both a "System Message" and a "Human Message", the LLM server (LM Studio in my case) only receives a single (strange) user message whose content is the serialized LangChain message array:

```json
"messages": [
  {
    "role": "user",
    "content": "[{\"lc\":1,\"type\":\"constructor\",\"id\":[\"langchain\",\"schema\",\"SystemMessage\"],\"kwargs\":{\"content\":\"Perform the instructions to the best of your ability.\",\"additional_kwargs\":{}}},{\"lc\":1,\"type\":\"constructor\",\"id\":[\"langchain\",\"schema\",\"HumanMessage\"],\"kwargs\":{\"content\":\"### Instruction: who was Joseph Weizenbaum?\\n### Response:\",\"additional_kwargs\":{}}}]"
  }
]
```

Is there something I'll have to configure or add in order to get a "proper" LLM request?
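For comparison, a "proper" request would carry the two messages separately, in the standard OpenAI chat completions shape (a sketch, reusing the message contents from the dump above):

```json
"messages": [
  { "role": "system", "content": "Perform the instructions to the best of your ability." },
  { "role": "user", "content": "### Instruction: who was Joseph Weizenbaum?\n### Response:" }
]
```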
-
Well, what I found out so far: it seems to be a problem with "ChatLocalAI" rather than with the "Chat Prompt Template", as everything works fine when using the "ChatOpenAI" LLM with an appropriate "BasePath".
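In case it helps, this is roughly what the workaround looks like in plain LangChain JS; the port and the dummy API key are assumptions, adjust them to your LM Studio setup:

```typescript
import { ChatOpenAI } from 'langchain/chat_models/openai'
import { SystemMessage, HumanMessage } from 'langchain/schema'

// Point ChatOpenAI at the local OpenAI-compatible server instead of api.openai.com
const model = new ChatOpenAI(
    { openAIApiKey: 'not-needed-locally', temperature: 0 },
    { basePath: 'http://localhost:1234/v1' }
)

// The system and human messages now reach the server as separate,
// role-tagged entries in the "messages" array
const response = await model.call([
    new SystemMessage('Perform the instructions to the best of your ability.'),
    new HumanMessage('who was Joseph Weizenbaum?')
])
console.log(response.content)
```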
-
I meanwhile found the problem. In contrast to "ChatOpenAI" (which works fine), "ChatLocalAI" uses

```typescript
import { OpenAIChat } from 'langchain/llms/openai'
import { OpenAIChatInput } from 'langchain/chat_models/openai'
```

and

```typescript
const model = new OpenAIChat(obj, { basePath })
```

`OpenAIChat` is the plain-text LLM wrapper, which explains the observed payload: the chain's messages get stringified into a single user turn instead of being sent with their roles. After replacing these lines with

```typescript
import { ChatOpenAI, OpenAIChatInput } from 'langchain/chat_models/openai'
```

and

```typescript
const model = new ChatOpenAI(obj, { basePath })
```

the prompt was interpreted properly.
-
Oops, you should also replace

```typescript
this.baseClasses = [this.type, 'BaseChatModel', ...getBaseClasses(OpenAIChat)]
```

with

```typescript
this.baseClasses = [this.type, ...getBaseClasses(ChatOpenAI)]
```
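Putting both posts together, the relevant parts of the patched node would look roughly like this. This is only a sketch: the class skeleton, the `init` signature and the `getBaseClasses` import path are simplified assumptions about the Flowise node, not the actual file.

```typescript
import { ChatOpenAI, OpenAIChatInput } from 'langchain/chat_models/openai'
// Flowise's shared helper; the exact relative path depends on your checkout
import { getBaseClasses } from '../../../src/utils'

class ChatLocalAI_ChatModels {
    type = 'ChatLocalAI'
    baseClasses: string[]

    constructor() {
        // was: [this.type, 'BaseChatModel', ...getBaseClasses(OpenAIChat)]
        this.baseClasses = [this.type, ...getBaseClasses(ChatOpenAI)]
    }

    // obj and basePath stand in for the values the node assembles from its inputs
    async init(obj: Partial<OpenAIChatInput>, basePath: string) {
        // was: new OpenAIChat(obj, { basePath }), which stringified the messages
        return new ChatOpenAI(obj, { basePath })
    }
}
```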