
PuPu

Your AI, your way — beautifully simple.

A desktop AI client for local and cloud models, with workspace-aware chat in one clean app.

Download for Mac    Download for Windows    Download for Linux

PuPu is a cross-platform desktop AI client that lets you chat with local models through Ollama or connect to cloud providers such as OpenAI and Anthropic. It is built for people who want one fast desktop app for everyday AI work, not a browser tab maze.

If you find the project useful, please star the repo. ⭐

PuPu UI

Why PuPu

  • Local and cloud models in one place
    • Use Ollama for local models, or switch to supported hosted providers when you need them.
  • Workspace-aware chat
    • Attach a project folder so PuPu can work with your local files in context.
  • A cleaner desktop workflow
    • Keep conversations, settings, and tools inside one native app on macOS, Windows, and Linux.
  • Built for real usage
    • Manage multiple chats, keep context close to your work, and avoid bouncing between disconnected tools.

Get PuPu

Download the latest release:

Mac (Apple Silicon)  Mac (Intel)

Download for Windows

Linux (.AppImage)  Linux (.deb)

Windows

  1. Download the latest .exe installer:

    Download for Windows

  2. Run the installer.

  3. Launch PuPu from the Start menu.

macOS

  1. Download the latest .dmg:

    Mac (Apple Silicon)  Mac (Intel)

  2. Open the disk image.

  3. Drag PuPu into Applications.

  4. Launch PuPu from Applications.

Linux

  1. Download the latest:

    Linux (.AppImage)  Linux (.deb)

  2. If you use the .deb, install it with:

    sudo apt install ./PuPu_0.1.1.deb

  3. If your system reports a Chromium sandbox permission error, run:

    sudo chown root:root /opt/PuPu/chrome-sandbox
    sudo chmod 4755 /opt/PuPu/chrome-sandbox

Quick Start

  1. Open PuPu.
  2. Choose how you want to run models:
    • locally, with Ollama
    • in the cloud, with a supported provider such as OpenAI or Anthropic
  3. Add any API key or provider settings in the app if needed.
  4. Optionally attach a workspace folder so PuPu can work with local files in context.
  5. Start chatting.
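Under the hood, local chat in a setup like this goes through an Ollama server, which by default listens on http://localhost:11434. As a rough illustration of what a single-prompt request looks like at that layer, here is a minimal sketch against Ollama's documented /api/generate endpoint; the model name llama3.2 is just an example and must already be pulled locally.

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body for a non-streaming /api/generate call."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(model: str, prompt: str) -> str:
    """Send a single prompt to a locally running Ollama server."""
    body = json.dumps(build_generate_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires `ollama serve` running and the model pulled,
    # e.g. `ollama pull llama3.2`.
    print(ask_ollama("llama3.2", "Say hello in one sentence."))
```

PuPu manages this for you; the sketch only shows the protocol a local chat ultimately rides on.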

What You Can Do

Work With Local Models

Run supported Ollama models directly from your machine without leaving the desktop app.
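To see which models a local Ollama install actually has available, Ollama also exposes a model list at /api/tags. A small sketch, assuming a running Ollama server on the default port:

```python
import json
import urllib.request

# Ollama's default local endpoint for listing pulled models.
TAGS_URL = "http://localhost:11434/api/tags"

def model_names(tags_response: dict) -> list:
    """Extract model names from an /api/tags JSON response."""
    return [m["name"] for m in tags_response.get("models", [])]

def list_local_models() -> list:
    """Ask a locally running Ollama server which models are pulled."""
    with urllib.request.urlopen(TAGS_URL) as resp:
        return model_names(json.loads(resp.read()))

if __name__ == "__main__":
    # Requires `ollama serve` running locally.
    for name in list_local_models():
        print(name)
```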

Connect To Hosted Providers

Switch to supported cloud providers when you want stronger hosted models or a different workflow.

Attach A Workspace

Give PuPu a workspace and keep the conversation tied to the files you are actually working on.

Keep Chats Organized

Manage multiple conversations without losing context or cluttering your workflow.

Screenshots

PuPu showcase

Roadmap

  • Agent Builder
  • Agent Teams and Skills
  • MCP integration

Contributing

Contributions are welcome.

By submitting a contribution, you agree to the terms in docs/CLA.md. In short:

  • you keep ownership of your contribution;
  • the project may ship your contribution under Apache-2.0; and
  • the project may also reuse or relicense accepted contributions in future commercial, dual-licensed, source-available, or proprietary offerings.

If you are contributing code or assets owned by your employer or client, make sure you have authority to do so before opening a pull request.

License And Trademark

PuPu is distributed under the Apache License 2.0.

This means the code can be used, modified, and redistributed commercially, but the PuPu name and brand are not automatically included in those rights.

If you ship a modified fork, rename it and replace PuPu branding unless you have written permission to use the marks.
