Replies: 2 comments 3 replies
-
🤖 Hello,

Thank you for using LangChain and for your detailed question. LangChain does support function calling within the language model's responses.

The issue you're experiencing seems to be related to the way the language model is interpreting and executing the function call. The function call should be executed and its result returned as part of the model's response. In your case, however, the function call is being returned as a string instead of being executed. This could be due to a number of reasons, such as an error in the function definition, an issue with the way the function is being called, or a problem with the language model itself.

To help troubleshoot this issue, could you please provide the following additional information:
Once we have this information, we'll be able to better assist you in resolving this issue.

Best regards
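As a concrete way to tell which case you are in, you can inspect the raw message the API returns: a proper function call arrives in a structured field, while the bug described here leaks the call into the plain-text content. The following is a minimal sketch assuming the OpenAI chat-completions message format (a dict with optional "function_call" and "content" keys); the marker strings are taken from the output quoted in the question.

```python
def classify_message(message: dict) -> str:
    """Classify an OpenAI-style chat message dict.

    Returns "function_call" if the model returned a structured call,
    "leaked_call" if the call text leaked into the content field
    (the symptom described in this thread), and "text" otherwise.
    """
    if message.get("function_call"):
        return "function_call"
    content = message.get("content") or ""
    # The leaked form seen in the question looks like:
    #   #assistant to=functions.getProducts<|im_sep|>{...}
    if "to=functions." in content and "<|im_sep|>" in content:
        return "leaked_call"
    return "text"
```

Logging the classification per response would show how often the leak happens and whether it correlates with particular prompts.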
-
@SKalide We see the same issue periodically (gpt-4-0613). Did you find a solution?
-
Hello.
I am using LangChain with the GPT-4 model. My idea is to ask the model for a product, and it should use a function to search for it in a database. For example, if I ask for a Dell laptop (or whatever), the model should call getProducts to generate the query. The answer is then streamed back to me.
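For reference, the function definition for such a setup might look roughly like the sketch below. This follows the OpenAI function-calling schema; the parameter names (category, brand, facets) are taken from the leaked output quoted in this post, but the exact schema used in my app is simplified here.

```python
# Sketch of an OpenAI-style function definition for getProducts.
# Parameter names mirror the arguments seen in the leaked output;
# descriptions and the "required" list are illustrative assumptions.
get_products_function = {
    "name": "getProducts",
    "description": "Search the product database and return matching products.",
    "parameters": {
        "type": "object",
        "properties": {
            "category": {
                "type": "string",
                "description": "Product category, e.g. 'Waschmaschine'",
            },
            "brand": {
                "type": "string",
                "description": "Brand name, e.g. 'Bosch'",
            },
            "facets": {
                "type": "string",
                "description": "Comma-separated facet filters, e.g. '7 kg, 8 kg'",
            },
        },
        "required": ["category"],
    },
}
```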
In most cases this works fine, but sometimes something goes wrong. Instead of making a function call, the model gives me back the call as plain text in its answer:
"#assistant to=functions.getProducts<|im_sep|>{
"category": "Waschmaschine",
"brand": "Bosch",
"facets": "7 kg, 8 kg"
}
Or something like this. Does anyone have an idea why this is happening? Is this a problem with LangChain or with OpenAI?
For some context, here is my last example (it was in German, but you get the idea; I have translated it):
Q: I am looking for a washing machine from Bosch or Miele for a 3-person household.
A: [...] to cover laundry needs. Let's see what we can find in our database for Bosch and Miele washing machines with this capacity.
I will start a search now and then make three suitable suggestions for you. Please give me a moment.🔎👍
#assistant to=functions.getProducts<|im_sep|>{
"category": "Waschmaschine",
"brand": "Bosch",
"facets": "7 kg, 8 kg"
}
Here are some Bosch washing machines that might meet your requirements:
[...]
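Until the root cause is clear, one workaround I am considering is detecting the leaked pattern in the streamed text and recovering the call myself. A minimal sketch, assuming the leaked text keeps the exact `to=functions.<name><|im_sep|>{...}` shape shown above (the regex and fallback logic are my own, not part of LangChain or OpenAI):

```python
import json
import re

# Matches the leaked call format seen in the transcript, e.g.
#   #assistant to=functions.getProducts<|im_sep|>{ ... }
LEAKED_CALL = re.compile(
    r"to=functions\.(?P<name>\w+)<\|im_sep\|>(?P<args>\{.*\})",
    re.DOTALL,
)

def recover_function_call(text: str):
    """Return (function_name, parsed_args) if the text contains a
    leaked function call, otherwise None."""
    match = LEAKED_CALL.search(text)
    if not match:
        return None
    try:
        args = json.loads(match.group("args"))
    except json.JSONDecodeError:
        return None
    return match.group("name"), args
```

With something like this, the streamed answer could be checked before display, and a recovered call could be dispatched to the real getProducts handler instead of being shown to the user.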