ReactNativeLLM is a reference implementation for building offline-capable conversational AI experiences on iOS and Android with a single TypeScript codebase. It combines on-device inference, contextual retrieval, and a polished chat interface into a modular foundation that can be extended for production deployments.
- Offline-first pipeline powered by `react-native-ai` for running quantized language models on device.
- Context orchestration that segments Markdown knowledge bases and surfaces relevant snippets per turn.
- A clean React Native architecture with screen-level stores, reusable UI primitives, and typed service layers.
- Documentation site (Docusaurus) covering integration guides, API contracts, and operations playbooks.
An annotated product walkthrough is bundled as `demo.MP4` in the repository root.
- Model lifecycle management – download, cache, and activate models with progress feedback and network awareness.
- Conversational workspace – Gifted Chat integration, adaptive theme support, and context toggles for human-in-the-loop control.
- Context intelligence – Markdown parsing, embedding-friendly chunking, and relevance scoring via Fuse.js.
- Resilience features – persisted session state, graceful fallback paths when context is unavailable, and telemetry hooks for diagnostics.
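To make the context-intelligence flow concrete, here is a minimal sketch of the chunk-and-score pipeline: split a Markdown knowledge base into paragraph chunks, score each against the latest turn, and surface the best matches. The project uses Fuse.js for fuzzy relevance scoring; the token-overlap score below is a simplified stand-in to show the shape of the pipeline, and function names such as `topChunks` are illustrative rather than the project's actual API.

```typescript
// Paragraph-level chunking: break a Markdown document on blank lines.
function chunkMarkdown(markdown: string): string[] {
  return markdown
    .split(/\n{2,}/)
    .map((block) => block.trim())
    .filter((block) => block.length > 0);
}

// Lowercase word tokens, used by the naive overlap score below.
function tokenize(text: string): Set<string> {
  return new Set(text.toLowerCase().match(/[a-z0-9]+/g) ?? []);
}

// Fraction of query tokens that appear in the chunk (stand-in for
// the fuzzy score Fuse.js would compute).
function relevance(chunk: string, query: string): number {
  const chunkTokens = tokenize(chunk);
  const queryTokens = tokenize(query);
  let overlap = 0;
  for (const token of queryTokens) {
    if (chunkTokens.has(token)) overlap += 1;
  }
  return queryTokens.size === 0 ? 0 : overlap / queryTokens.size;
}

// Return the best-matching chunks for the user's latest turn.
export function topChunks(markdown: string, query: string, limit = 3): string[] {
  return chunkMarkdown(markdown)
    .map((chunk) => ({ chunk, score: relevance(chunk, query) }))
    .filter((entry) => entry.score > 0)
    .sort((a, b) => b.score - a.score)
    .slice(0, limit)
    .map((entry) => entry.chunk);
}
```

Swapping the scorer for a `Fuse` instance (with `includeScore: true`) keeps the same pipeline shape while gaining typo-tolerant matching.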
- `src/screens` hosts high-level navigation containers for model selection and chat workflows.
- `src/components` provides composable UI focused on chat ergonomics and control surfaces.
- `src/hooks` encapsulates domain-specific state machines (model preparation, context refresh, connectivity).
- `src/services` packages file system utilities, context processing logic, and adapters to the model control plane.
- `src/theme` manages cross-platform visual styling using a context-driven design token approach.
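As one illustration of the hook-level state machines mentioned above, model preparation can be modeled as a pure reducer that a hook (e.g. via `useReducer`) drives. This is a sketch under assumed names; `prepareReducer` and the event shapes are hypothetical, not the project's actual types.

```typescript
// Hypothetical states for the model-preparation lifecycle.
type PrepareState =
  | { status: 'idle' }
  | { status: 'downloading'; progress: number }
  | { status: 'ready'; modelId: string }
  | { status: 'error'; message: string };

// Hypothetical events emitted by the download/activation services.
type PrepareEvent =
  | { type: 'DOWNLOAD_STARTED' }
  | { type: 'PROGRESS'; progress: number }
  | { type: 'COMPLETED'; modelId: string }
  | { type: 'FAILED'; message: string };

// Pure reducer: easy to unit-test in isolation, then wired into a
// screen-level store or a useReducer-based hook.
export function prepareReducer(state: PrepareState, event: PrepareEvent): PrepareState {
  switch (event.type) {
    case 'DOWNLOAD_STARTED':
      return { status: 'downloading', progress: 0 };
    case 'PROGRESS':
      // Ignore stray progress events outside an active download.
      return state.status === 'downloading'
        ? { status: 'downloading', progress: event.progress }
        : state;
    case 'COMPLETED':
      return { status: 'ready', modelId: event.modelId };
    case 'FAILED':
      return { status: 'error', message: event.message };
  }
}
```

Keeping the transition logic pure means the progress-feedback and network-awareness behavior can be tested without rendering any UI.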
Refer to docs/ for an extended architectural deep dive and sequence diagrams.
- Node.js 18 or later
- React Native CLI environment (Xcode/iOS Simulator on macOS, Android Studio + SDK/NDK)
- CocoaPods (macOS) for iOS dependencies
git clone <repository-url>
cd ReactNativeLLM
npm install # or: yarn install
# iOS only
(cd ios && pod install)

# Start Metro in a dedicated terminal
npm start
# iOS simulator
npm run ios
# Android emulator or attached device
npm run android

Additional scripts: `npm test`, `npm run lint`, and `npx tsc --noEmit` cover unit tests, linting, and static analysis respectively.
- Use the Model Selection screen to download or activate a model before entering the chat experience.
- Long-press the context toggle to generate a sample `context.md` while prototyping.
- Store curated Markdown knowledge in the documents directory; the context manager will reindex on demand.
- Observe network status via the header indicator to gauge whether downloads are possible.
- Testing – Jest harness with React Native Testing Library (see `__tests__/`).
- Linting & Formatting – ESLint with the React Native recommended baseline plus Prettier.
- Type Safety – TypeScript strict mode across screens, hooks, and services.
- CI Ready – Scripts are structured for easy adoption in GitHub Actions or Bitrise pipelines.
- Product and integration guides live in `docs/`.
- The Docusaurus site in `website/` can be served locally with `npm start` after installing dependencies inside that directory.
- API references are grouped by components, hooks, and services to simplify onboarding for new contributors.
- Model lifecycle enhancements (removal, version pinning, cloud sync).
- In-app context editing and Markdown validation.
- Export pipelines for curated conversations and context audit trails.
- Accessibility review including screen reader flows and high-contrast themes.
- Optional server-side relay for hybrid on-device/cloud inference.
- All inference occurs on device; no prompts or responses are transmitted externally by default.
- Context files remain in the application sandbox and can be rotated by the end user.
- Review `mlc-config.json` and related platform files before shipping to production stores.