Real-time health AI on the wrist: running an agent with live sensor access on Galaxy Watch #103
ThinkOffApp started this conversation in Show and tell
OpenHealth's approach to AI-powered health insights from your own data is interesting. We've been exploring a related angle: what if the AI agent lives on your body and reads health sensors in real time?
ClawWatch (https://github.com/ThinkOffApp/ClawWatch) is an open-source AI agent that runs natively on Galaxy Watch. In v2.0 it reads the watch's sensors directly: heart rate, SpO2, barometric pressure, ambient light, step count, motion, and altitude. The on-device runtime is NullClaw (a 2.8 MB Zig binary), and voice input runs through offline Vosk STT.
The difference from batch health data analysis: the agent is always on your wrist, so it can combine what it knows about your physical state right now with conversation context from connected agents and rooms. Ask "how am I doing" and it can tell you about your heart rate and what your team has been discussing in the same answer.
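ClawWatch's actual internals aren't shown here, but the fusion of "physical state right now" with conversation context can be sketched roughly like this (the `SensorSnapshot` type and `answer` function are hypothetical names for illustration, not ClawWatch's API):

```kotlin
// Hypothetical illustration: combine an immediate sensor snapshot with
// recent conversation topics into one answer, as described above.
data class SensorSnapshot(
    val heartRateBpm: Int?,   // null if the sensor hasn't locked on yet
    val spo2Percent: Int?,
    val steps: Int
)

fun answer(snapshot: SensorSnapshot, recentTopics: List<String>): String {
    // Only mention vitals the sensors actually returned.
    val vitals = buildList {
        snapshot.heartRateBpm?.let { add("heart rate $it bpm") }
        snapshot.spo2Percent?.let { add("SpO2 $it%") }
        add("${snapshot.steps} steps today")
    }.joinToString(", ")
    val context = if (recentTopics.isEmpty()) "no recent room activity"
                  else "your team discussed ${recentTopics.joinToString(", ")}"
    return "You're at $vitals; $context."
}
```

The point of the sketch is just that both inputs are available in the same call: no sync step between the body-state side and the conversation side.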
For the health data side, the sensor readings are immediate. No sync delay, no upload step. The agent sees what the watch sees. The limitation is that we're constrained to what the Galaxy Watch sensors provide and how quickly they return readings (heart rate can take a few seconds to lock on).
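One practical consequence of the lock-on delay: the agent has to gate heart-rate readings until they stabilize. A minimal sketch, assuming (as is common on wrist sensors) that an unlocked sensor reports 0 or implausible values, could look like this (`HeartRateGate` is an illustrative name, not part of ClawWatch):

```kotlin
// Hypothetical lock-on gate: suppress readings until the sensor has
// produced `stableCount` consecutive plausible values.
class HeartRateGate(private val stableCount: Int = 3) {
    private var streak = 0

    /** Returns the bpm once locked on, or null while still settling. */
    fun offer(bpm: Int): Int? {
        streak = if (bpm in 30..220) streak + 1 else 0
        return if (streak >= stableCount) bpm else null
    }
}
```

A gate like this trades a few extra seconds of latency for not answering "how am I doing" with a spurious 0 bpm.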
Would be interesting to explore whether ClawWatch's real-time sensor stream could feed into OpenHealth's data model for a combination of on-wrist immediate context and longer-term health trend analysis.