MobileClaw is a mobile-first chat UI for the OpenClaw agent platform. Built with Next.js 16, Tailwind CSS v4, and zero component libraries. The UI has been modularized into focused components.
- Next.js 16 (App Router, Turbopack)
- Tailwind CSS v4 with OKLch color tokens
- TypeScript (strict mode disabled for speed)
- Geist font via the `geist` package (sans + mono)
- pnpm — package manager (do NOT use npm or create a package-lock.json)
- Vitest for unit testing
- No component library — all UI is hand-rolled with inline SVG icons
- `app/`
  - `page.tsx` — main page, state management, backend switching
  - `layout.tsx` — root layout, fonts, metadata
  - `globals.css` — Tailwind config, OKLch color tokens, dark mode
  - `LocatorProvider.tsx` — dev-only tree locator (v0 tooling)
  - `api/lmstudio/` — proxy for LM Studio API calls
- `components/`
  - `ChatInput.tsx` — message input with command autocomplete
  - `CommandSheet.tsx` — slash command picker
  - `MessageRow.tsx` — individual message rendering
  - `SetupDialog.tsx` — backend selection (OpenClaw/LM Studio/Demo)
  - `StreamingText.tsx` — animated text streaming
  - `ThinkingIndicator.tsx` — reasoning/thinking display
  - `ToolCallPill.tsx` — tool call status badges
  - `ImageThumbnails.tsx` — image attachment previews
- `lib/`
  - `lmStudio.ts` — LM Studio OpenAI-compatible client with SSE streaming
  - `useWebSocket.ts` — WebSocket hook for OpenClaw with reconnect backoff
  - `demoMode.ts` — demo mode handler, mock history, keyword responses
  - `toolDisplay.ts` — maps tool names/args to human-friendly labels
  - `messageUtils.ts` — message content helpers
  - `notifications.ts` — push notification support
  - `utils.ts` — general utilities
- `types/`
  - `chat.ts` — shared TypeScript types (Message, ContentPart, etc.)
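Only `Message` and `ContentPart` are named in this README; a hypothetical sketch of what `types/chat.ts` might contain (every field below is an illustrative assumption, not the actual definition):

```typescript
// Hypothetical sketch of types/chat.ts — only the type names Message and
// ContentPart appear in the README; all fields here are assumptions.
export type ContentPart =
  | { type: "text"; text: string }
  | { type: "image"; url: string };

export interface Message {
  id: string;
  role: "user" | "assistant";
  parts: ContentPart[];
  createdAt: number;
}

// Example: a simple user message
export const example: Message = {
  id: "msg-1",
  role: "user",
  parts: [{ type: "text", text: "hello" }],
  createdAt: Date.now(),
};
```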
```sh
pnpm install
pnpm run dev    # http://localhost:3000 (Turbopack)
pnpm run build  # production build
pnpm test       # run Vitest tests (63 tests)
```

- When developing features for the iOS app, ONLY run `make build-web` after completing the task.
- Run `make pr-comments` to review current PR comments.
MobileClaw supports three backend modes, selectable in the setup dialog:
**OpenClaw**
- Connects to the OpenClaw gateway via WebSocket
- Full agent capabilities, tool execution, reasoning streams
- Requires a URL and an optional auth token

**LM Studio**
- Connects to a local LM Studio server (OpenAI-compatible API)
- Supports `<think>...</think>` tag parsing for reasoning models
- Auto-detects models that skip the opening `<think>` tag
- `onStreamStart` fires after HTTP 200 (confirms the server is processing)

**Demo**
- Fully client-side simulation, no server required
- Visit `localhost:3000?demo` to auto-enter
- Or leave the URL empty in the setup dialog and click "Start Demo"
- Keywords trigger different responses: "weather", "code", "think", "error", "research", "agent", "help"
- Start LM Studio and load a model
- Enable the local server (default: `http://localhost:1234`)
- Run `pnpm run dev`
- In the setup dialog, select "LM Studio" and enter `http://localhost:1234`
- Select your model from the dropdown
- Send a message — "Thinking..." appears when the server accepts the request
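LM Studio's OpenAI-compatible endpoint streams responses as Server-Sent Events. A minimal sketch of the kind of SSE line parsing `lib/lmStudio.ts` likely performs (the function name and exact buffering behavior are assumptions; a real client would also buffer partial JSON across network chunks):

```typescript
// Hypothetical SSE parser: extracts content deltas from "data: {...}" lines
// of an OpenAI-compatible streaming response. "data: [DONE]" ends the stream.
export function parseSseChunk(chunk: string): string[] {
  const deltas: string[] = [];
  for (const line of chunk.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed.startsWith("data:")) continue;
    const payload = trimmed.slice(5).trim();
    if (payload === "[DONE]") break;
    try {
      const json = JSON.parse(payload);
      const delta = json.choices?.[0]?.delta?.content;
      if (typeof delta === "string") deltas.push(delta);
    } catch {
      // Ignore incomplete JSON; a real client buffers across chunks.
    }
  }
  return deltas;
}
```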
- Modular components: UI split into focused components in `components/`
- Shared types: all types in `types/chat.ts`
- No component library: use raw HTML elements + Tailwind classes
- Inline SVG icons: no icon library — copy SVG directly into JSX
- OKLch colors: all color tokens in `globals.css` use `oklch()` — never use hex or named colors
- Mobile-first: `h-dvh` viewport, touch handlers, iOS Safari fixes
- CSS variable animations: scroll morph bar uses the `--sp` CSS custom property for 60fps animations
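The `--sp` custom-property pattern can be sketched as follows (a guess at the technique, not the actual scroll morph bar code; the element shape is a minimal structural type so the sketch stays self-contained):

```typescript
// Hypothetical sketch of the CSS-variable animation pattern: compute a 0..1
// scroll progress and write it to --sp so CSS animates without re-rendering.
export function scrollProgress(scrollTop: number, maxScroll: number): number {
  if (maxScroll <= 0) return 0;
  return Math.min(1, Math.max(0, scrollTop / maxScroll));
}

// Minimal structural type standing in for HTMLElement (illustrative).
interface ScrollableEl {
  scrollTop: number;
  scrollHeight: number;
  clientHeight: number;
  style: { setProperty(name: string, value: string): void };
  addEventListener(type: string, cb: () => void, opts?: object): void;
}

export function bindScrollVar(el: ScrollableEl): void {
  el.addEventListener(
    "scroll",
    () => {
      const sp = scrollProgress(el.scrollTop, el.scrollHeight - el.clientHeight);
      // CSS consumes this, e.g. transform: scaleX(var(--sp))
      el.style.setProperty("--sp", sp.toFixed(3));
    },
    { passive: true } // passive listener avoids blocking scroll at 60fps
  );
}
```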
- PR #3: Expandable ChatInput textarea (auto-grows with content)
- PR #4: Comprehensive test suite (63 Vitest tests covering components, utils, handlers)
- PR #5: Push notifications when agent completes (with iOS PWA safety)
- PR #6: Migrated from `next/font/google` to the `geist` package
- LM Studio `onStreamStart` now fires after HTTP 200 (accurate "Thinking..." timing)
- WebSocket handler refactored into sub-handlers for cleaner code
- iOS keyboard layout fixes for Safari PWA
- Font CSS variables properly reference geist package vars
- Requests permission on first message send
- Notifies when agent finishes responding (if tab not focused)
- Safe try/catch wrapper for iOS PWA edge cases
- See `lib/notifications.ts`
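The safe wrapper described above could look roughly like this (a sketch, not the actual `lib/notifications.ts`; the guard conditions are assumptions, and the Notification API is accessed via `globalThis` so the sketch also runs outside a browser):

```typescript
// Hypothetical safe notification wrapper. On iOS PWAs the Notification API
// can be missing or throw, so everything is feature-detected and wrapped.
export function notifySafely(title: string, body: string): boolean {
  try {
    const N = (globalThis as any).Notification;
    // Feature-detect: Notification is absent in Node and some iOS PWAs.
    if (typeof N === "undefined") return false;
    if (N.permission !== "granted") return false;
    new N(title, { body });
    return true;
  } catch {
    return false; // Never let a notification failure break the chat UI
  }
}
```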
MobileClaw connects to OpenClaw's gateway WebSocket. Protocol frames:
- Server sends `event:connect.challenge` with a nonce
- Client responds with `req:connect` including auth token and capabilities
- Server responds with `res:hello-ok` including server info and a session snapshot
- Client requests `req:chat.history` to load message history
- Messages flow via `event:chat` (delta/final/aborted/error) and `event:agent` (content/tool/reasoning/lifecycle streams)
- Client sends `req:chat.send` with user messages
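The client side of the handshake above could be sketched as frame builders (the `type` strings come from this README; every payload field name — `nonce`, `token`, `capabilities`, `message` — is an illustrative assumption about the wire format):

```typescript
// Hypothetical frame builders for the OpenClaw gateway protocol. The frame
// type strings match the README; payload shapes are assumptions.
type Frame = { type: string; [key: string]: unknown };

export function connectFrame(nonce: string, token?: string): Frame {
  return {
    type: "req:connect",
    nonce,                           // echo of the connect.challenge nonce (assumed)
    token,                           // optional auth token
    capabilities: ["chat", "tools"], // assumed capability names
  };
}

export function chatSendFrame(text: string): Frame {
  return { type: "req:chat.send", message: { role: "user", text } };
}
```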