Replies: 1 comment 1 reply
-
🤖 Hello kj-itk,

Based on the code you've shared, it seems like you're not setting the maxTokens parameter, which caps the length of the model's responses. Here's how you can modify the code:

    const model = new ChatOpenAI({
      temperature: 1,
      maxTokens: 500, // Increase the maxTokens value here
      azureOpenAIApiKey: OPENAI_API_KEY,
      azureOpenAIApiVersion: OPENAI_API_VERSION,
      azureOpenAIApiDeploymentName: OPENAI_AZURE_DEPLOYMENT_NAME,
      azureOpenAIApiInstanceName: OPENAI_AZURE_INSTANCE_NAME
    });

In this updated code, the maxTokens parameter is set to 500, which should allow longer responses.

I found a similar issue in the langchainjs repository: How can i get longer responses?. However, it was closed without any comments or a solution.

I hope this helps! If you have any other questions, feel free to ask.
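One pitfall worth checking in this situation: if the config object is built in several steps (for example, a base config spread together with overrides), a later spread can silently drop the maxTokens override. The helper below is a hypothetical sketch (withMaxTokens is not part of LangChain) showing how to apply the override last so it cannot be clobbered:

```typescript
// Plain-object shape mirroring the ChatOpenAI constructor fields used above.
interface ChatOpenAIFields {
  temperature?: number;
  maxTokens?: number;
  [key: string]: unknown;
}

// Hypothetical helper: returns a copy of the config with maxTokens applied
// last, so an earlier spread in the base config cannot override it.
function withMaxTokens(base: ChatOpenAIFields, maxTokens: number): ChatOpenAIFields {
  return { ...base, maxTokens };
}

// Example: start from a config that still has the old cap and raise it to 800.
const config = withMaxTokens({ temperature: 1, maxTokens: 500 }, 800);
console.log(config.maxTokens); // 800
```

The resulting object can then be passed straight to `new ChatOpenAI(config)`.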
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
-
Hi everyone.
I have a SvelteKit app using LangChain and the Vercel AI SDK. It works, but the responses are short, around 300 tokens. I tried adding maxTokens to ChatOpenAI, but the response length is unchanged. Has anyone faced the same issue?