llama-cloak is a simple GUI for interacting with your local Llama instance. It's a React Router v7 app. To run it locally, you just need npm and llama3.2.
To get a local copy up and running, follow these steps:
- Run Llama locally
  ```sh
  ollama run llama3.2
  ```
- Clone the repo
  ```sh
  git clone https://github.com/the-wc/llama-cloak.git
  ```
- Install NPM packages
  ```sh
  npm install
  ```
- Start the dev server (for a React Router v7 app this is typically `npm run dev`) and open llama-cloak at http://localhost:5173.
Type a message into the chat interface and you'll get a response.
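Under the hood, a chat message has to reach the local model somehow. Here's a minimal sketch of what a non-streaming request to Ollama's documented `/api/chat` endpoint (default port 11434) looks like; the helper names (`buildChatRequest`, `chat`) are illustrative, not the app's actual code.

```typescript
// Hypothetical helpers sketching a chat round-trip against a local
// Ollama instance. The endpoint and payload shape follow Ollama's
// /api/chat API; everything else is illustrative.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

function buildChatRequest(messages: ChatMessage[], model = "llama3.2") {
  // Ollama's /api/chat accepts { model, messages, stream }
  return {
    url: "http://localhost:11434/api/chat",
    body: { model, messages, stream: false },
  };
}

async function chat(messages: ChatMessage[]): Promise<string> {
  const { url, body } = buildChatRequest(messages);
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  if (!res.ok) throw new Error(`Ollama returned ${res.status}`);
  const data = await res.json();
  // Non-streaming responses carry the reply in message.content
  return data.message.content;
}
```

Setting `stream: true` instead returns the reply as a stream of JSON chunks, which is what you'd want for a typing-style UI.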
I threw this together over a couple of hours, so feel free to clone/fork it and make your own changes. Some ideas:
- Handle optimistic UI with a prompt
- Store chats uniquely
- GPU temp/mem monitor
- Online status ping for checking local llama instance
- Configuration for advanced use cases
- Agentic/MCP support
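For the online-status-ping idea above, one possible sketch: Ollama answers `GET /` on its base URL with "Ollama is running", so a quick fetch with a timeout works as a liveness check. The function name and defaults here are hypothetical.

```typescript
// Hypothetical liveness check for a local Ollama instance.
// Ollama responds to GET on its base URL, so a short, time-limited
// fetch distinguishes "up" from "down or unreachable".
async function isOllamaOnline(
  baseUrl = "http://localhost:11434",
  timeoutMs = 1000
): Promise<boolean> {
  try {
    const res = await fetch(baseUrl, {
      signal: AbortSignal.timeout(timeoutMs), // abort if it hangs
    });
    return res.ok;
  } catch {
    // Connection refused or timed out: treat the instance as offline
    return false;
  }
}
```

Polling this every few seconds would be enough to drive an online/offline badge in the UI.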
See the open issues for a full list of proposed features (and known issues).
Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
Distributed under the MIT License. See LICENSE.txt for more information.
Will - @dubkaycee
Project Link: https://github.com/the-wc/llama-cloak