Using Llama 3.1 with function calling #7500
Now that Llama 3.1 supports function calling and tools, and it is available through providers such as Ollama and Together.ai, can we use it in Semantic Kernel for automatic plugin calls?
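For context, Llama 3.1's tool calling can already be exercised through Ollama's OpenAI-compatible endpoint with the plain `openai` client, independently of Semantic Kernel. The sketch below is only an illustration: it assumes Ollama is running locally on its default port with the `llama3.1` model pulled and a version recent enough to support tool calls, and `get_weather` is a hypothetical tool definition.

```python
# pip install openai
# Assumption: Ollama is running locally (default port 11434) with `ollama pull llama3.1`.
from openai import OpenAI

# Ollama ignores the API key, but the client requires a non-empty value.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

# Hypothetical tool definition; the model decides whether to call it.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="llama3.1",
    messages=[{"role": "user", "content": "What is the weather in Paris right now?"}],
    tools=tools,
)

message = response.choices[0].message
if message.tool_calls:
    call = message.tool_calls[0]
    # e.g. get_weather {"city": "Paris"} -- the caller is responsible for executing it.
    print(call.function.name, call.function.arguments)
else:
    print(message.content)
```

Note that this only returns the tool call; executing the function and feeding the result back is left to the caller, which is exactly the loop that Semantic Kernel's automatic function invocation is meant to handle.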
Answered by rogerbarreto (Jul 29, 2024)
Thanks for opening this issue! First, we are targeting to release the Ollama Connector and get it merged into main soon. After that we will improve it, including the task below, which also addresses this discussion point! 😊
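Until that connector lands, one interim approach is to point Semantic Kernel's existing OpenAI connector at Ollama's OpenAI-compatible endpoint and enable automatic function invocation. The sketch below is a rough illustration only, not an official recipe: it assumes the Semantic Kernel Python SDK, and the `async_client` parameter, the `FunctionChoiceBehavior.Auto()` setting, and the endpoint URL are assumptions that may differ between SDK and Ollama versions.

```python
# pip install semantic-kernel openai
# Rough sketch only: assumes Ollama is running locally with the llama3.1 model pulled,
# and a Semantic Kernel Python version that supports FunctionChoiceBehavior.
import asyncio
from datetime import datetime, timezone

from openai import AsyncOpenAI
from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.function_choice_behavior import FunctionChoiceBehavior
from semantic_kernel.connectors.ai.open_ai import (
    OpenAIChatCompletion,
    OpenAIChatPromptExecutionSettings,
)
from semantic_kernel.contents import ChatHistory
from semantic_kernel.functions import kernel_function


class TimePlugin:
    """Hypothetical plugin exposing one native function the model may call."""

    @kernel_function(name="utc_now", description="Returns the current UTC time.")
    def utc_now(self) -> str:
        return datetime.now(timezone.utc).isoformat()


async def main() -> None:
    kernel = Kernel()

    # Reuse the OpenAI connector against Ollama's OpenAI-compatible endpoint
    # (base_url and the dummy api_key are assumptions, not an official setup).
    chat_service = OpenAIChatCompletion(
        ai_model_id="llama3.1",
        async_client=AsyncOpenAI(base_url="http://localhost:11434/v1", api_key="ollama"),
    )
    kernel.add_service(chat_service)
    kernel.add_plugin(TimePlugin(), plugin_name="time")

    # Ask the service to let the model choose and invoke kernel functions automatically.
    settings = OpenAIChatPromptExecutionSettings(
        function_choice_behavior=FunctionChoiceBehavior.Auto()
    )

    history = ChatHistory()
    history.add_user_message("What is the current UTC time?")
    replies = await chat_service.get_chat_message_contents(
        chat_history=history, settings=settings, kernel=kernel
    )
    print(replies[0])


if __name__ == "__main__":
    asyncio.run(main())
```

The key design point is that automatic invocation lives in the execution settings rather than in the model: whichever connector is used (OpenAI-compatible today, the dedicated Ollama connector later), enabling automatic function choice is what lets Llama 3.1's tool calls be resolved against registered plugins without manual plumbing.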