
ARScribe

ARScribe turns AR glasses into subtitles for the real world.

It provides fast, private, on-device live transcription and translation, then presents the result on a connected AR display so spoken conversation can be read as captions in front of you.

Overview

ARScribe is an iOS app built for AR glasses connected to an iPhone or iPad. It listens to live speech through the device microphone, transcribes it on device, and can translate finalized speech into another language before displaying it as readable captions in the connected glasses.

The app keeps core speech processing local to the device for privacy and responsiveness. Speed and latency depend on the device model you use.

Features

  • Live on-device speech transcription
  • Live on-device translation
  • Adjustable subtitle position and text scale
  • Source and target language selection
  • Automatic language asset download when required by the OS
  • Privacy-first design with processing handled on device

Supported Languages

Any language supported by Apple’s SpeechAnalyzer and Translation frameworks is supported.

Privacy

ARScribe is designed around on-device processing. Speech recognition and translation are handled locally through Apple system frameworks (SpeechAnalyzer + Translation), helping keep conversations private while maintaining fast response times.

Limitations

Currently, ARScribe also transcribes your own voice. This is redundant and will be fixed in an upcoming release.

Requirements

  • iOS 26.0 or later
  • Xcode 26.0 or later
  • An iPhone or iPad
  • AR glasses that can connect to your iPhone or iPad as an external display
  • Microphone permission enabled
  • Speech language assets installed when prompted by the system

Compatibility Warning

ARScribe has been tested only on XREAL glasses. It should be compatible with any AR glasses that can connect to your iPhone or iPad as an external display.

How It Works

  1. Connect supported AR glasses to your iPhone or iPad.
  2. Launch ARScribe.
  3. The glasses act as the subtitle display.
  4. Start recording from the main screen.
  5. Spoken audio is transcribed on device in real time.
  6. If translation is enabled, finalized transcript blocks are translated before display.
  7. Captions appear in the AR view with adjustable size and vertical placement.
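Step 6 can be sketched as a small gating rule. The type and member names below are hypothetical illustrations, not taken from the ARScribe source: a finalized transcript block is only handed to the translation stage when translation is enabled and the source and target languages actually differ.

```swift
import Foundation

/// Hypothetical helper mirroring step 6 of the pipeline.
struct CaptionPipeline {
    var translationEnabled: Bool
    var sourceLanguage: String   // BCP-47 identifiers, e.g. "en-US"
    var targetLanguage: String

    /// Returns true when a finalized transcript block should be translated
    /// before being displayed on the glasses. Volatile (non-finalized)
    /// results are shown as-is and never translated.
    func shouldTranslate(isFinalized: Bool) -> Bool {
        isFinalized && translationEnabled && sourceLanguage != targetLanguage
    }
}
```

Translating only finalized blocks keeps latency low: volatile partial results update the caption immediately, while translation runs once per completed utterance.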

Running the Project

  1. Open ARScribe.xcodeproj in Xcode.
  2. Select an iOS 26.0+ device.
  3. Build and run the ARScribe target.
  4. Connect your AR glasses before starting a session.

In-App Controls

  • Toggle Captions: Starts or stops live transcription
  • Translation: Enables translation and lets you choose source and target languages
  • Positioning: Adjusts subtitle vertical offset and text scale for the AR display
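The Positioning control amounts to two clamped values. The sketch below is a minimal illustration with assumed names and ranges (a normalized -1…1 vertical offset and a 0.5×–3× text scale); the actual app may use different units.

```swift
import Foundation

/// Hypothetical model for the Positioning control.
struct SubtitleLayout {
    private(set) var verticalOffset: Double = 0.0 // -1 (top) ... 1 (bottom), assumed range
    private(set) var textScale: Double = 1.0      // multiplier on the base caption size

    /// Clamp the offset so captions never leave the visible AR area.
    mutating func setVerticalOffset(_ value: Double) {
        verticalOffset = min(max(value, -1.0), 1.0)
    }

    /// Clamp the scale to keep text legible without overflowing the display.
    mutating func setTextScale(_ value: Double) {
        textScale = min(max(value, 0.5), 3.0)
    }
}
```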

Notes

  • If AR glasses are not connected, the app shows a connection screen instead of the subtitle experience.
  • For the best experience, the app suggests disabling Anchor mode on compatible glasses.
  • The app may download required speech assets the first time a language is used.
  • Translation only applies when source and target languages differ.

Tech Stack

  • Swift
  • SwiftUI
  • Apple Speech framework
  • Apple Translation framework
  • AVFoundation

Origin

ARScribe started as a project at Jewel City Hacks, a local hackathon in LA, where it won “The Best Overall Hack,” “Best Hardware Hack,” and the “Top 5 Hackers” award.

License

This project is released under the CC0 1.0 Universal license. See LICENSE for details.
