Ollama client using MicroPython #15307
shariltumin
started this conversation in Show and tell
I've been experimenting with an Ollama client written in MicroPython for the past few days. I've created a GitHub repository called "ollama-client-micropython." You can find it here.
There you can find ollama.py, a client library for Ollama. I also included a short description of how to install the Ollama program on a Linux machine and how to manually start the Ollama server, so that we can run tests on WiFi-enabled MCU boards. It might be possible to send sensor data to the Ollama server and obtain useful inferences from it. This could serve as a starting point for AI-IoT smart systems.
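To illustrate the sensor-data idea, here is a minimal sketch of the request an MCU client would send to Ollama's documented /api/generate HTTP endpoint. Note that `build_generate_request` is a hypothetical helper for illustration (not the actual API of ollama.py), and "llama3.2" is just an example model name; on a board, the body would be POSTed with something like urequests.

```python
import json

def build_generate_request(model, prompt, stream=False):
    # Hypothetical helper: returns the JSON body for POST /api/generate,
    # the Ollama endpoint for a one-shot completion. stream=False asks
    # the server for a single JSON response instead of chunked output,
    # which is easier to parse on a memory-constrained MCU.
    return json.dumps({"model": model, "prompt": prompt, "stream": stream})

# Example: turn a sensor reading into a prompt for the model.
body = build_generate_request(
    "llama3.2",  # example model name; use whichever model your server has pulled
    "Temperature is 31.5 C and rising. Should the fan turn on? Answer yes or no.",
)

# On a WiFi-enabled board this body would be sent with, e.g.:
#   urequests.post("http://<server-ip>:11434/api/generate", data=body)
# and the "response" field of the returned JSON holds the model's answer.
```

The non-streaming mode keeps the client simple: one POST, one JSON object back, no chunk reassembly on the MCU.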
I hope it can be useful to those who are interested in generative AI technology and large language models (LLMs).
There is a video titled "The Turing Lectures: The Future of Generative AI" that you might find interesting. You can watch it here.