A collection of on-device AI primitives for React Native with first-class Vercel AI SDK support. Run AI models directly on users' devices for privacy-preserving, low-latency inference without server costs.
- **Instant AI** - Use built-in system models immediately without downloads
- **Privacy-first** - All processing happens on-device, data stays local
- **Vercel AI SDK compatible** - Drop-in replacement with familiar APIs
- **Complete toolkit** - Text generation, embeddings, transcription, speech synthesis
Native integration with Apple's on-device AI capabilities:
- Text Generation - Apple Foundation Models for chat and completion
- Embeddings - NLContextualEmbedding for 512-dimensional semantic vectors
- Transcription - SpeechAnalyzer for fast, accurate speech-to-text
- Speech Synthesis - AVSpeechSynthesizer for natural text-to-speech with system voices
```sh
npm install @react-native-ai/apple
```
No additional linking is needed; the library autolinks and works immediately on iOS devices.
```ts
import { apple } from '@react-native-ai/apple'
import {
  generateText,
  embed,
  experimental_transcribe as transcribe,
  experimental_generateSpeech as speech
} from 'ai'

// Text generation with Apple Intelligence
const { text } = await generateText({
  model: apple(),
  prompt: 'Explain quantum computing'
})

// Generate embeddings
const { embedding } = await embed({
  model: apple.textEmbeddingModel(),
  value: 'Hello world'
})

// Transcribe audio
const { text: transcript } = await transcribe({
  model: apple.transcriptionModel(),
  audio: audioBuffer
})

// Text-to-speech
const { audio } = await speech({
  model: apple.speechModel(),
  text: 'Hello from Apple!'
})
```
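The vectors returned by `embed` can be compared with plain cosine similarity for semantic search or deduplication. A minimal sketch (the helper below is illustrative, not part of the library):

```typescript
// Cosine similarity between two embedding vectors, such as the
// 512-dimensional vectors produced by apple.textEmbeddingModel().
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0
  let normA = 0
  let normB = 0
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i]
    normA += a[i] * a[i]
    normB += b[i] * b[i]
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB))
}
```

Scores close to 1 indicate semantically similar texts; embed two strings with the same model and compare their vectors with this function.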
| Feature | iOS Version | Additional Requirements |
| --- | --- | --- |
| Text Generation | iOS 26+ | Apple Intelligence device |
| Embeddings | iOS 17+ | - |
| Transcription | iOS 26+ | - |
| Speech Synthesis | iOS 13+ | iOS 17+ for Personal Voice |
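The version requirements above can be gated at runtime before calling a model. A minimal sketch, assuming you pass in React Native's `Platform.Version` (a string on iOS); the helper names are illustrative, not part of the library:

```typescript
// Minimum iOS major version per feature, mirroring the table above.
const FEATURE_MIN_IOS = {
  textGeneration: 26,
  embeddings: 17,
  transcription: 26,
  speechSynthesis: 13,
} as const

// Parse the major version from a string like '17.4'.
function iosMajorVersion(version: string): number {
  return parseInt(version.split('.')[0], 10)
}

// In an app: isFeatureAvailable('embeddings', String(Platform.Version))
function isFeatureAvailable(
  feature: keyof typeof FEATURE_MIN_IOS,
  version: string
): boolean {
  return iosMajorVersion(version) >= FEATURE_MIN_IOS[feature]
}
```

Note that the OS version check alone is not sufficient for text generation, which also requires an Apple Intelligence-capable device.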
See the Apple documentation for detailed setup and usage guides.
Run any open-source LLM locally using MLC's optimized runtime. Currently in development and not recommended for production use.
```sh
npm install @react-native-ai/mlc
```
Requires manual setup including installing the MLC CLI. See the setup guide.
```ts
import { getModel, prepareModel } from '@react-native-ai/mlc'
import { generateText } from 'ai'

// Download and prepare model
await prepareModel('Llama-3.2-3B-Instruct')

// Generate response with Llama via MLC engine
const { text } = await generateText({
  model: getModel('Llama-3.2-3B-Instruct'),
  prompt: 'Explain quantum computing'
})
```
> **Note**
> MLC support is experimental. Follow the setup guide for detailed installation instructions.
Support for Google's on-device models is planned for future releases.
Comprehensive guides and API references are available at react-native-ai.dev.
Read the contribution guidelines before contributing.
react-native-ai is an open source project and will always remain free to use. If you think it's cool, please star it ⭐.
Callstack is a group of React and React Native geeks. Contact us at [email protected] if you need any help with these projects or just want to say hi!
Made with create-react-native-library