ARScribe turns AR glasses into subtitles for the real world.
It provides fast, private, on-device live transcription and translation, then presents the result on a connected AR display so spoken conversation can be read as captions in front of you.
ARScribe is an iOS app built for AR glasses connected to an iPhone or iPad. It listens to live speech through the device microphone, transcribes it on device, and can translate finalized speech into another language before displaying it as readable captions in the connected glasses.
The app keeps core speech processing local to the device for privacy and responsiveness. Speed and latency depend on the device model you use.
- Live on-device speech transcription
- Live on-device translation
- Adjustable subtitle position and text scale
- Source and target language selection
- Automatic language asset download when required by the OS
- Privacy-first design with processing handled on device
Any language supported by Apple’s SpeechAnalyzer and Translation frameworks is supported.
ARScribe is designed around on-device processing. Speech recognition and translation are handled locally through Apple system frameworks (SpeechAnalyzer + Translate), helping keep conversations private while maintaining fast response times.
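A live transcription pipeline built on these frameworks can be sketched as below. This is a minimal illustration based on Apple's iOS 26 SpeechAnalyzer API, not ARScribe's actual implementation; the option sets and the way results are consumed may differ in the app.

```swift
import Speech

// Sketch only: wire a live on-device transcription stream with the
// iOS 26 SpeechAnalyzer API. Names beyond the framework types are
// illustrative, not ARScribe's.
func runTranscription(locale: Locale) async throws {
    // A transcriber module that reports volatile (in-progress) results
    // as well as finalized ones.
    let transcriber = SpeechTranscriber(
        locale: locale,
        transcriptionOptions: [],
        reportingOptions: [.volatileResults],
        attributeOptions: []
    )
    let analyzer = SpeechAnalyzer(modules: [transcriber])

    // Microphone buffers get pushed into this stream as AnalyzerInput.
    let (inputSequence, inputBuilder) = AsyncStream.makeStream(of: AnalyzerInput.self)
    try await analyzer.start(inputSequence: inputSequence)
    _ = inputBuilder  // feed AnalyzerInput(buffer:) from the audio tap

    // Finalized blocks are what would be handed to translation
    // before being displayed as captions.
    for try await result in transcriber.results where result.isFinal {
        print(result.text)
    }
}
```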
Currently, ARScribe also transcribes your own voice. This is redundant and will be fixed in an upcoming release.
- iOS 26.0 or later
- Xcode 26.0 or later
- An iPhone or iPad
- AR glasses that can connect to your iPhone or iPad as an external display
- Microphone permission enabled
- Speech language assets installed when prompted by the system
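Checking the microphone permission before a session starts can be sketched with the iOS 17+ `AVAudioApplication` API. This is an illustration, not ARScribe's code:

```swift
import AVFoundation

// Sketch only: make sure microphone access is granted before capture.
// Uses the iOS 17+ AVAudioApplication API.
func ensureMicrophoneAccess() async -> Bool {
    switch AVAudioApplication.shared.recordPermission {
    case .granted:
        return true
    case .denied:
        return false
    case .undetermined:
        // Prompts the user on first use.
        return await AVAudioApplication.requestRecordPermission()
    @unknown default:
        return false
    }
}
```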
> **Warning**
> Tested only on XREAL glasses. It should be compatible with any AR glasses that can connect to your iPhone or iPad as an external display.
- Connect supported AR glasses to your iPhone or iPad.
- Launch ARScribe.
- The glasses act as the subtitle display.
- Start recording from the main screen.
- Spoken audio is transcribed on device in real time.
- If translation is enabled, finalized transcript blocks are translated before display.
- Captions appear in the AR view with adjustable size and vertical placement.
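The audio-capture side of the steps above is typically an `AVAudioEngine` tap feeding the speech pipeline. A minimal sketch, with illustrative names (not ARScribe's actual functions); a real session would also configure `AVAudioSession` first:

```swift
import AVFoundation

// Sketch only: capture microphone audio and hand each buffer to the
// transcription pipeline via a callback.
func startMicrophoneCapture(
    onBuffer: @escaping (AVAudioPCMBuffer) -> Void
) throws -> AVAudioEngine {
    let engine = AVAudioEngine()
    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)

    // Each tapped buffer would be forwarded into the speech analyzer.
    input.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
        onBuffer(buffer)
    }

    engine.prepare()
    try engine.start()
    return engine
}
```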
- Open `ARScribe.xcodeproj` in Xcode.
- Select an iOS 26.0+ device.
- Build and run the `ARScribe` target.
- Connect your AR glasses before starting a session.
- **Toggle Captions**: Starts or stops live transcription
- **Translation**: Enables translation and lets you choose source and target languages
- **Positioning**: Adjusts subtitle vertical offset and text scale for the AR display
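The Positioning control amounts to clamping a user-chosen vertical offset and text scale into readable bounds before applying them to the caption layer. A minimal sketch; the ranges and type name here are illustrative, not ARScribe's actual limits:

```swift
// Sketch only: keep captions on screen and readable by clamping
// user-chosen layout values. Ranges are illustrative.
struct CaptionLayout {
    var verticalOffset: Double  // fraction of screen height, 0 = center
    var textScale: Double       // multiplier on the base font size

    func clamped() -> CaptionLayout {
        CaptionLayout(
            verticalOffset: min(max(verticalOffset, -0.4), 0.4),
            textScale: min(max(textScale, 0.5), 3.0)
        )
    }
}
```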
- If AR glasses are not connected, the app shows a connection screen instead of the subtitle experience.
- For the best experience, the app suggests disabling Anchor mode on compatible glasses.
- The app may download required speech assets the first time a language is used.
- Translation only applies when source and target languages differ.
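The language-difference check in the last note can be sketched as a small helper that compares base language codes. The function name and normalization are illustrative, not ARScribe's:

```swift
// Sketch only: translation is skipped when the source and target
// languages are effectively the same ("en-US" vs "en").
func needsTranslation(sourceCode: String, targetCode: String) -> Bool {
    let source = sourceCode.split(separator: "-").first.map(String.init)?.lowercased()
    let target = targetCode.split(separator: "-").first.map(String.init)?.lowercased()
    return source != target
}
```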
- Swift
- SwiftUI
- Apple Speech framework
- Apple Translation framework
- AVFoundation
- `ARScribe/Views`: Main UI, subtitle display, settings, and AR-glasses connection state
- `ARScribe/Models`: App state, transcription flow coordination, translation flow, and external display management
- `ARScribe/Recording and Transcription`: Audio capture and speech transcription pipeline
- `ARScribe/Helpers`: Language helpers and audio buffer utilities
ARScribe started as a project at Jewel City Hacks, a hackathon in Los Angeles, where it won “The Best Overall Hack,” “Best Hardware Hack,” and the “Top 5 Hackers” award.
This project is released under the CC0 1.0 Universal license. See LICENSE for details.