Replies: 3 comments
-
Hey there, @gbarton! Great to see you back with another intriguing challenge for us. Thank you for providing the detailed information.

The error message "Unsupported message type for Ollama: function" suggests that the Ollama Functions model is receiving a message type it doesn't support. From the stack trace, the error appears to be thrown when the chat messages are converted into Ollama's request format. In your code, you're creating an agent with createOpenAIFunctionsAgent and running it inside an AgentExecutor. The error suggests that somewhere in this process a function-type message is being passed to the model, which is not supported; most likely this happens when the executor feeds a tool result back to the model as a function message.

Unfortunately, without more information about these classes and their methods, it's hard to say exactly what's causing the error. It might be helpful to look at their source code to see where the function message is being passed to the model. One possible solution could be to ensure that the model you hand to the agent actually accepts function-style messages. Another could be to inspect the intermediate messages the agent produces and convert any function messages into a message type the Ollama model does support.

I hope this helps! If you have any other questions or need further clarification, please let me know.
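If it helps, here is a minimal sketch of driving OllamaFunctions directly and handling the tool call yourself, so that no function message is ever sent back to the model. This is not taken from your code: the "mistral" model, the get_current_weather function, and the hard-coded tool result are placeholders.

```typescript
import { OllamaFunctions } from "langchain/experimental/chat_models/ollama_functions";
import { HumanMessage } from "@langchain/core/messages";

// Bind one function definition to the model ("mistral" and the schema are placeholders).
const model = new OllamaFunctions({ model: "mistral", temperature: 0 }).bind({
  functions: [
    {
      name: "get_current_weather",
      description: "Get the current weather in a given location",
      parameters: {
        type: "object",
        properties: {
          location: { type: "string", description: "City name" },
        },
        required: ["location"],
      },
    },
  ],
});

const question = new HumanMessage("What's the weather in Paris?");
const first = await model.invoke([question]);
const call = first.additional_kwargs.function_call;

if (call) {
  // Run the tool yourself, then hand the result back as plain text
  // instead of a FunctionMessage, which the Ollama chat model rejects.
  const toolResult = JSON.stringify({ temperature: "18C" }); // placeholder tool output
  const followUp = await model.invoke([
    question,
    new HumanMessage(`The ${call.name} tool returned: ${toolResult}. Answer the question.`),
  ]);
  console.log(followUp.content);
}
```

This sidesteps AgentExecutor entirely, which is the part that injects the unsupported function message.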
-
@gbarton have you been able to get this to work?
-
Same experience, I'm using OllamaEmbeddings. If I pass the Ollama model to createOpenAIFunctionsAgent, it does not work, but if I use the OpenAI model, it works. I also tried the OpenAI model constructor with Ollama's baseUrl and apiKey. I read somewhere that any API key should do, but unless I use a real OpenAI API key, I get an invalid API key error. So I'm guessing it still hits the OpenAI endpoint instead of Ollama? Any tips? Thanks
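Roughly what I'm attempting, in case it helps (a sketch; the http://localhost:11434/v1 URL and the "mistral" model are just my local defaults, and the key is a placeholder):

```typescript
import { ChatOpenAI } from "@langchain/openai";

// Point the OpenAI client at a local Ollama server's OpenAI-compatible endpoint.
const model = new ChatOpenAI({
  modelName: "mistral",
  openAIApiKey: "ollama", // placeholder; Ollama does not check this
  configuration: {
    baseURL: "http://localhost:11434/v1",
  },
});

const res = await model.invoke("Say hello");
console.log(res.content);
```

My understanding is that if the base URL override is actually applied, the key is never validated, so an invalid-key error would mean requests are still going to api.openai.com.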
-
Checked other resources
Commit to Help
Example Code
Description
I'm trying to merge the agent quickstart (https://js.langchain.com/docs/modules/agents/quick_start) with Ollama Functions (https://js.langchain.com/docs/integrations/chat/ollama_functions) to be able to run tools with Ollama. No luck so far; having read #3942, I thought it might be possible to make it work.
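The shape of what I'm attempting is roughly this (a trimmed sketch rather than my full example code; the get_word_length tool and the "mistral" model are placeholders, and the hub prompt is the one the agent quickstart uses):

```typescript
import { OllamaFunctions } from "langchain/experimental/chat_models/ollama_functions";
import { createOpenAIFunctionsAgent, AgentExecutor } from "langchain/agents";
import { DynamicTool } from "langchain/tools";
import { pull } from "langchain/hub";
import type { ChatPromptTemplate } from "@langchain/core/prompts";

// Ollama model with OpenAI-style function calling (experimental wrapper).
const llm = new OllamaFunctions({ model: "mistral", temperature: 0 });

// A single placeholder tool so the agent has something to call.
const tools = [
  new DynamicTool({
    name: "get_word_length",
    description: "Returns the length of a word.",
    func: async (input: string) => String(input.length),
  }),
];

// Prompt from the agent quickstart.
const prompt = await pull<ChatPromptTemplate>("hwchase17/openai-functions-agent");

const agent = await createOpenAIFunctionsAgent({ llm, tools, prompt });
const executor = new AgentExecutor({ agent, tools });

// This is where it fails with "Unsupported message type for Ollama: function",
// apparently once the executor sends the tool result back as a function message.
const result = await executor.invoke({ input: "How many letters are in 'ollama'?" });
console.log(result.output);
```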
Thanks in advance!
OUTPUT BELOW:
System Info
node ➜ /workspaces/mila (master) $ npm info langchain
[email protected] | MIT | deps: 18 | versions: 259
node ➜ /workspaces/mila (master) $ node --version
v20.10.0