
✨ Little Star App


"Twinkle, twinkle, little star, how I wonder what you are..."

Little Star App is a magical AI playground designed to bring the power of Large Language Models (LLMs) directly to your device. Powered by Twinkle AI and the robust llama.cpp engine, this app lets you explore, download, and chat with AI models completely offline.


🌟 Key Features

  • 🚀 On-Device Inference: Run GGUF models locally with privacy and speed using llama.cpp.
  • 💬 AI Chat: Interact with LLMs through a user-friendly chat interface.
  • 📥 Model Manager: Integrated browser to discover and download GGUF models from Hugging Face.
  • ⚡ Performance Testing: "Completion Mode" for measuring a model's raw throughput and generation speed.
  • 📱 Cross-Platform: Built with Flutter for Android, iOS, and Desktop (WIP).
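The Model Manager above ultimately comes down to fetching GGUF files from Hugging Face's standard `resolve` endpoint. Below is a minimal Python sketch of that URL scheme; the function name, repo, and file are illustrative and not the app's actual Dart implementation or catalog:

```python
# Illustrative sketch of the Hugging Face direct-download URL pattern
# used for GGUF files: https://huggingface.co/<repo>/resolve/<rev>/<file>
# (gguf_download_url and the example repo/file are hypothetical choices.)
from urllib.parse import quote

HF_BASE = "https://huggingface.co"

def gguf_download_url(repo_id: str, filename: str, revision: str = "main") -> str:
    """Build the standard Hugging Face 'resolve' URL for a file in a repo."""
    return f"{HF_BASE}/{repo_id}/resolve/{quote(revision)}/{quote(filename)}"

url = gguf_download_url(
    "TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF",
    "tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf",
)
print(url)
```

A real client would stream the response to disk and verify the file size or checksum before handing the path to the inference engine.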

📸 Screenshots

| Platform | Home & Models | Chat Interface | Model Completion |
|----------|---------------|----------------|------------------|
| Android  | Home          | Chat           | Completion       |
| iOS      | Home          | Chat           | Completion       |

🛠️ Tech Stack

🚀 Getting Started

Prerequisites

  • Flutter SDK (3.7.0 or later)
  • C++ toolchain (CMake plus GCC or Clang) for building the native libraries.
  • Android NDK (for Android build) / Xcode (for iOS build).

Installation

  1. Clone the repository (including submodules):

     ```bash
     git clone --recursive https://github.com/your-org/little-star-app.git
     cd little-star-app
     ```

  2. Install dependencies:

     ```bash
     flutter pub get
     ```

  3. Build & run:

     ```bash
     flutter run
     ```

📚 Documentation

Check out our development notes for deep dives into the implementation.

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🙏 Acknowledgments
