Talk to AI-powered NPCs in Pokemon games using local LLMs.
Press SELECT in-game to speak with mysterious characters who respond dynamically using a local language model. No internet required, no API keys, fully offline.
```
┌─────────────┐      ┌─────────────┐      ┌─────────────┐
│    mGBA     │      │  bridge.py  │      │  llama.cpp  │
│  Emulator   │─────▶│   Python    │─────▶│   Server    │
│   + Lua     │◀─────│   Bridge    │◀─────│    (LLM)    │
└─────────────┘      └─────────────┘      └─────────────┘
   IPC via files        HTTP to localhost:8080
```
The Lua script captures button presses in mGBA, writes player messages to a file, and the Python bridge sends them to a local llama.cpp server. Responses come back through the same pipeline.
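The bridge loop can be sketched roughly like this. It is a minimal illustration, not the repo's actual `bridge.py`: the file names `request.txt`/`response.txt` are assumptions about the file-based IPC, and the `/completion` endpoint with `n_predict` is llama.cpp's standard server API.

```python
import time
from pathlib import Path

import requests

REQUEST_FILE = Path("request.txt")    # written by the Lua script (name is an assumption)
RESPONSE_FILE = Path("response.txt")  # read back by the Lua script (name is an assumption)
LLAMA_URL = "http://localhost:8080/completion"  # llama.cpp server's completion endpoint


def ask_llm(message: str) -> str:
    """Forward the player's message to the local llama.cpp server and return its reply."""
    resp = requests.post(LLAMA_URL, json={"prompt": message, "n_predict": 64}, timeout=60)
    resp.raise_for_status()
    return resp.json()["content"]


def run_bridge(poll_interval: float = 0.2) -> None:
    """Poll for a request file, forward its contents, and write the reply back."""
    while True:
        if REQUEST_FILE.exists():
            message = REQUEST_FILE.read_text().strip()
            REQUEST_FILE.unlink()  # consume the request so it is handled only once
            RESPONSE_FILE.write_text(ask_llm(message))
        time.sleep(poll_interval)
```

Call `run_bridge()` to start the loop; polling files keeps the Lua side simple, since mGBA scripts only need plain file reads and writes rather than sockets.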
Three NPCs with distinct personalities:
- Professor Hemlock — Cryptic researcher who speaks in riddles about legendary Pokemon and ancient bonds
- Vera — Pattern-obsessed astrologer who sees connections everywhere, casts horary charts, and mutters about what the Unown are really spelling
- Iris — Ageless starseed who attracts synchronicities, sees angel numbers, and leaves trainers feeling lighter
Characters are selected based on keywords in your message, or randomly if no keywords match.
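The routing described above can be sketched as a simple keyword lookup. The keyword lists here are illustrative guesses from the character descriptions, not the repo's actual triggers:

```python
import random

# Illustrative keyword triggers; the actual lists in the repo may differ.
CHARACTER_KEYWORDS = {
    "Professor Hemlock": ["legendary", "ancient", "research", "bond"],
    "Vera": ["star", "pattern", "unown", "chart"],
    "Iris": ["angel", "synchronicity", "energy", "light"],
}


def pick_character(message, rng=None):
    """Return the first character whose keywords appear in the message,
    or a random character when nothing matches."""
    lowered = message.lower()
    for name, keywords in CHARACTER_KEYWORDS.items():
        if any(word in lowered for word in keywords):
            return name
    rng = rng or random.Random()
    return rng.choice(list(CHARACTER_KEYWORDS))
```

First-match-wins keeps the routing deterministic for overlapping keywords, and the random fallback guarantees every message gets some persona.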
- macOS (tested on Apple Silicon)
- llama.cpp with a GGUF model
- mGBA emulator with Lua scripting support
- Python 3 with `requests`
- Pokemon ROM (Crystal or Emerald)
```
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp && cmake -B build && cmake --build build -j
mkdir models && cd models
# Download TinyLlama (fast, ~637MB) or a larger model for better quality
curl -LO https://huggingface.co/TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF/resolve/main/tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf
```

```
git clone https://github.com/erosika/pokemon-llm
cd pokemon-llm
python3 -m venv .venv
.venv/bin/pip install requests
```

```
brew install mgba
```

Build from source using pret/pokecrystal or pret/pokeemerald, or use your own legally obtained ROM.

```
mkdir roms
# Place your .gbc or .gba file in the roms folder
```

Edit run-server.sh, run-bridge.sh, and run-emulator.sh to match your local paths.
Open three terminal windows:
Terminal 1 — LLM Server:
```
./run-server.sh
```

Terminal 2 — Python Bridge:

```
./run-bridge.sh
```

Terminal 3 — Emulator:

```
./run-emulator.sh
```

Press SELECT in-game to talk to the LLM. Responses appear in the mGBA console.
- Pokemon Crystal (Game Boy Color)
- Pokemon Emerald (Game Boy Advance) — in progress
| Model | Size | Speed | Quality |
|---|---|---|---|
| TinyLlama 1.1B | ~637MB | Very fast | Chaotic but fun |
| Llama 3.2 3B | ~2GB | Fast | Good |
| Mistral 7B | ~5GB | Medium | Great |
| Llama 3.1 8B | ~6GB | Slower | Excellent |
Smaller models run faster but produce more erratic responses. Larger models need more RAM.
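Swapping models is a matter of pointing llama.cpp's server at a different GGUF file, for example (a sketch; the binary path assumes the cmake build above, and the model filename is just the TinyLlama example):

```shell
# Launch llama.cpp's HTTP server on port 8080 with a chosen GGUF model.
# Substitute -m with whichever model from the table you downloaded.
./build/bin/llama-server \
  -m models/tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf \
  --port 8080 \
  -c 2048   # context size; larger models can usually afford a bigger window
```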
- Display responses in actual game text boxes
- NPCs that appear in the overworld
- Location-aware dialog
- Party Pokemon awareness
- Quest generation based on game state
MIT