In the meantime, I found the problem. In contrast to "ChatOpenAI" (which works fine), "ChatLocalAI" uses

import { OpenAIChat } from 'langchain/llms/openai'
import { OpenAIChatInput } from 'langchain/chat_models/openai'

and

        const model = new OpenAIChat(obj, { basePath })

After replacing these lines with

import { ChatOpenAI, OpenAIChatInput } from 'langchain/chat_models/openai'

and

        const model = new ChatOpenAI(obj, { basePath })

the prompt was interpreted properly.
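
For context, here is a minimal sketch of the corrected instantiation. The model name, endpoint, and configuration fields are illustrative assumptions, not the exact Flowise ChatLocalAI node code:

    import { ChatOpenAI, OpenAIChatInput } from 'langchain/chat_models/openai'

    // hypothetical configuration — field names chosen for illustration
    const obj: Partial<OpenAIChatInput> = {
        modelName: 'ggml-gpt4all-j', // whichever model the LocalAI server exposes
        temperature: 0.7
    }

    // LocalAI's OpenAI-compatible endpoint (adjust host/port to your setup)
    const basePath = 'http://localhost:8080/v1'

    // ChatOpenAI sends the prompt as structured chat messages,
    // which the OpenAI-compatible endpoint interprets correctly
    const model = new ChatOpenAI(obj, { basePath })

The underlying difference, as I understand it: OpenAIChat is an LLM-style class that flattens everything into a single completion prompt, while ChatOpenAI is a chat-model class that preserves the system/human message roles, which is why the prompt is now interpreted properly.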
