Summary
Creature mood, animations, and app background on the phone should be controlled by the server, which gets this information from Claude. Currently the creature visuals are static - instead, Claude should influence how the creature looks, behaves, and what environment it's in based on conversation context.
Proposed Flow
- Claude - During responses, Claude outputs mood/emotion metadata and environment info (e.g. happy, thinking, confused, excited + forest, space, ocean)
- Server - Parses mood/environment from Claude's output and sends it to connected clients via WebSocket (new message types, e.g. `creature_mood`, `creature_background`)
- Phone app - Receives updates and dynamically adjusts `CreatureView` animations, colors, particle effects, expressions, and the app background
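The new message types could look something like the sketch below. This is only a strawman schema for discussion: the field names (`mood`, `intensity`, `theme`) are assumptions, not a settled wire format.

```python
import json

def make_mood_message(mood: str, intensity: float = 1.0) -> str:
    """Serialize a hypothetical creature_mood update for broadcast to clients."""
    return json.dumps({"type": "creature_mood", "mood": mood, "intensity": intensity})

def make_background_message(theme: str) -> str:
    """Serialize a hypothetical creature_background update."""
    return json.dumps({"type": "creature_background", "theme": theme})

print(make_mood_message("happy"))
# {"type": "creature_mood", "mood": "happy", "intensity": 1.0}
```

The optional `intensity` field is one way to let the client scale an animation (e.g. a small vs. big bounce) without multiplying the set of named moods.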
Creature Mood Examples
- User asks a complex question -> Claude is "thinking" -> creature shows contemplative animation
- Claude successfully completes a task -> "happy" mood -> creature bounces, glows brighter
- Claude encounters an error -> "confused" mood -> creature shows puzzled expression
- Idle state -> "calm" mood -> gentle breathing animation
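On the client side, the examples above could map to a simple lookup table. The animation names and glow values here are illustrative placeholders, not real `CreatureView` parameters.

```python
# Assumed mood set from the examples above; unknown moods fall back to "calm".
MOOD_ANIMATIONS = {
    "thinking": {"animation": "contemplate", "glow": 0.4},
    "happy": {"animation": "bounce", "glow": 1.0},
    "confused": {"animation": "puzzled", "glow": 0.5},
    "calm": {"animation": "breathe", "glow": 0.3},
}

def animation_for(mood: str) -> dict:
    """Resolve a mood string to animation parameters, defaulting to idle."""
    return MOOD_ANIMATIONS.get(mood, MOOD_ANIMATIONS["calm"])
```

Defaulting to the idle state keeps the app robust if the server ever sends a mood the client build doesn't know about.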
Background Examples
- Claude is working on code -> dark terminal-like background
- Casual conversation -> warm gradient background
- Claude is debugging -> red-tinted alert background
- Night time / idle -> starry sky background
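If backgrounds end up as predefined themes (one of the open questions below), the mapping could be equally small. The context labels and theme names are assumptions made up for this sketch.

```python
# Hypothetical context -> background theme table, mirroring the examples above.
BACKGROUND_THEMES = {
    "coding": "dark_terminal",
    "casual": "warm_gradient",
    "debugging": "red_alert",
    "idle_night": "starry_sky",
}

def background_for(context: str, default: str = "warm_gradient") -> str:
    """Pick a background theme for a conversation context, with a safe default."""
    return BACKGROUND_THEMES.get(context, default)
```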
Open Questions
- How should Claude communicate mood and background? (structured output field, tool call, inferred from response tone)
- What set of moods/states and backgrounds should be supported?
- Should mood and background transitions be smooth or instant?
- Should backgrounds be predefined themes or fully dynamic (colors, gradients, particle systems)?
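For the first open question, one option is a structured output field: Claude emits a small tagged JSON block in its response, which the server strips before forwarding the text to clients. The `<creature>...</creature>` tag is a made-up convention purely for illustration.

```python
import json
import re

# Hypothetical metadata tag; a real design might use tool calls instead.
TAG_RE = re.compile(r"<creature>(.*?)</creature>", re.DOTALL)

def extract_metadata(response: str):
    """Split a model response into (clean_text, metadata dict or None)."""
    match = TAG_RE.search(response)
    if not match:
        return response, None
    meta = json.loads(match.group(1))
    return TAG_RE.sub("", response).strip(), meta

text, meta = extract_metadata(
    'Done! <creature>{"mood": "happy", "background": "forest"}</creature>'
)
# text == "Done!", meta == {"mood": "happy", "background": "forest"}
```

A tool call would be more robust than regex parsing (no risk of the tag leaking into user-visible text), at the cost of an extra round trip in the conversation.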