OneAIFW is a local, lightweight "AI firewall" that anonymizes sensitive data before it is sent to LLMs and restores it in the model's responses.
- Core engine: Zig + Rust (WASM and native)
- JS binding: libs/aifw-js (Transformers.js + the WASM aifw core)
- Python binding: libs/aifw-py (Python's transformers package + the native aifw core)
- Demos: web app (apps/webapp), browser extension (browser_extension)
- Backend service: py-origin (FastAPI + Presidio/LiteLLM)
Privacy:
- Physical Address
- Email Address
- Name
- Phone
- Bank Account
- Payment Information
Secrets:
- Verification Code
- Password
Crypto:
- Seed
- Private Key
- Address
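Many of the categories above (emails, phone numbers, crypto addresses, and so on) are natural regex targets. As an illustrative sketch only (the real recognizers are Rust `regex-automata` rules inside the core, with far more robust patterns), here is how a regex-based detector can turn raw text into labeled spans; the patterns and entity names below are simplified assumptions:

```python
import re

# Simplified illustrative patterns; the actual core uses Rust
# regex-automata recognizers with much more robust rules.
PATTERNS = {
    "EMAIL_ADDRESS": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "PHONE_NUMBER": re.compile(r"\+?\d[\d -]{7,}\d"),
}

def detect_spans(text):
    """Return (entity_type, start, end) spans found by each pattern."""
    spans = []
    for entity, pattern in PATTERNS.items():
        for m in pattern.finditer(text):
            spans.append((entity, m.start(), m.end()))
    return sorted(spans, key=lambda s: s[1])

spans = detect_spans("Mail test@example.com or call +1 555 0100 42")
print(spans)
```

Each span carries byte offsets into the original text, which is the shape the masking pipeline consumes.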
The site at oneaifw.com is a full end‑to‑end demo of the OneAIFW project.
It showcases how OneAIFW detects sensitive information, anonymizes it before LLM calls, and restores it afterward, all in a browser‑friendly UI backed by the same core engine used in this repository.
- core/: Zig core and pipelines (mask/restore)
- libs/aifw-js/: JavaScript library used in the browser and demos
- apps/webapp/: minimal browser demo (Vite)
- browser_extension/: Chrome/Edge extension sample
- py-origin/: Python backend service + CLI command (see its own README)
- tests/transformer-js/: Transformers.js demo and model prep scripts
High-level architecture (overview):

OneAIFW is built as a layered, cross-platform stack around a single core library. The aifw core library is designed to compile both as a native library (for backend services and CLIs) and as a WASM module (for browsers and JS runtimes), with language bindings providing ergonomic APIs on top.

- Core engine: aifw core library (Zig + Rust)
  Cross-platform engine that implements the masking/restoring pipelines and regex/NER span fusion. It builds to:
  - Native libraries for use from Python and other host languages.
  - WASM (`wasm32-freestanding`) for use in browsers and JS environments.
- Language bindings
  - aifw-js (`@oneaifw/aifw-js`): JavaScript/TypeScript binding that runs NER with Transformers.js, converts spans to byte offsets, calls the WASM core to mask/restore, and exposes a high-level API (including batch, language-aware masking, and integration helpers for web apps and extensions).
  - aifw-py: Python binding that loads the native core library and exposes a simple API (`mask_text`, `restore_text`, batch variants, and configuration) used by Python CLIs and HTTP services.
- Backends and apps
  - Web demo (`apps/webapp`): Vite-based frontend that talks to the core via `aifw-js`, demonstrating end-to-end masking/restoring flows entirely in the browser or against a backend service.
  - Browser extension (`browser_extension`): example Chrome/Edge extension that injects OneAIFW into arbitrary pages to protect prompts before they are sent to LLM UIs.
  - Python services (`cli/python`): HTTP APIs and CLI wrappers built on top of `aifw-py`, suitable for running as local daemons or containerized services behind gateways.
  - Presidio-based services (`py-origin`): HTTP APIs built on top of the Presidio library, suitable for running as local daemons or containerized services behind gateways.
- Production demo website
  The public demo at oneaifw.com is built from this stack and uses the same core engine and bindings described above.
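The regex/NER span fusion mentioned above can be pictured with a small sketch. This is not the core's actual SpanMerger (that lives in the Zig/Rust engine); it is an illustrative Python version of the merge/de-duplicate idea, assuming spans are `(start, end, label)` tuples:

```python
def merge_spans(spans):
    """Merge overlapping (start, end, label) spans, keeping the longer
    span when two candidates overlap (a common fusion heuristic)."""
    merged = []
    for span in sorted(spans, key=lambda s: (s[0], -(s[1] - s[0]))):
        if merged and span[0] < merged[-1][1]:
            # Overlap with the previous kept span: keep whichever is longer.
            if span[1] - span[0] > merged[-1][1] - merged[-1][0]:
                merged[-1] = span
            continue
        merged.append(span)
    return merged

# Regex found a partial email; NER found the full one plus a phone number.
regex_spans = [(5, 12, "EMAIL_ADDRESS")]
ner_spans = [(5, 21, "EMAIL_ADDRESS"), (30, 44, "PHONE")]
print(merge_spans(regex_spans + ner_spans))
```

The real merger also filters by entity type and confidence; this sketch only shows the overlap-resolution step.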
Prerequisites:

- Zig 0.15.2
- Rust toolchain (stable) + Cargo
  - `rustup target add wasm32-unknown-unknown`
- Node.js 18+ and pnpm 9+
  - Install pnpm: `npm i -g pnpm`
- Python 3.10+ (for the `py-origin` backend) and pip/venv
Verify versions:

```sh
zig version        # expect 0.15.2
rustc --version
cargo --version
node -v
pnpm -v
python3 --version
```

OneAIFW lets you safely call external LLM providers by anonymizing sensitive data first, then restoring it after the model response. You can run it as a local HTTP service or use an in-process CLI. Follow the steps below to get up and running quickly.
- Clone the repository:

  ```sh
  git clone https://github.com/funstory-ai/aifw.git
  cd aifw
  ```

- Build the aifw core library (native + WASM):

  ```sh
  # From repo root
  zig build
  ```

- Install JS workspace dependencies (pnpm workspace):

  ```sh
  pnpm -w install
  ```

- Build the JavaScript library aifw-js (bundles and stages the WASM core and models); this library provides a complete mask/restore pipeline based on Transformers.js:

  ```sh
  pnpm -w --filter @oneaifw/aifw-js build
  ```

- Run the web demo:

  ```sh
  cd apps/webapp
  pnpm dev
  # open the printed local URL
  ```

- Backend service and CLI based on Presidio (see py-origin/README.md):

  ```sh
  cd py-origin
  python -m venv .venv && source .venv/bin/activate
  pip install -r services/requirements.txt -r cli/requirements.txt
  python -m aifw launch
  ```

The py-origin project is a standalone subproject. All of its development and usage documentation is now maintained in py-origin/README.md.
For all configurable parameters, the resolution order is:

1. Command-line arguments
2. Environment variables
3. Config file (`aifw.yaml`)

For example, the LLM API key file is resolved as:

1. CLI: `--api-key-file`
2. Env: `AIFW_API_KEY_FILE`
3. Config: `api_key_file` in `aifw.yaml`

The same precedence applies to port, logging options, etc.
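A minimal sketch of this precedence (CLI over environment over config file); the function name and call shape here are hypothetical, the real resolution lives in the Python service code:

```python
import os

def resolve(cli_value, env_name, config, config_key, default=None):
    """Resolve one setting with CLI > environment > config-file precedence."""
    if cli_value is not None:
        return cli_value          # 1. command-line argument wins
    if env_name in os.environ:
        return os.environ[env_name]  # 2. then the environment variable
    return config.get(config_key, default)  # 3. finally the config file

config = {"api_key_file": "/etc/aifw/key.json"}  # parsed from aifw.yaml

os.environ.pop("AIFW_API_KEY_FILE", None)
print(resolve(None, "AIFW_API_KEY_FILE", config, "api_key_file"))

os.environ["AIFW_API_KEY_FILE"] = "/env/key.json"
print(resolve(None, "AIFW_API_KEY_FILE", config, "api_key_file"))
print(resolve("/cli/key.json", "AIFW_API_KEY_FILE", config, "api_key_file"))
```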
We use Zig's build system to produce the native and WASM artifacts and to orchestrate the Rust static libraries.

High-level targets:

- Core: `zig build` (default)
- Unit tests: `zig build -Doptimize=Debug test`
- Integration test exe: `zig build inttest` (run `zig-out/bin/aifw_core_test`)

Environment variables during JS model preparation (used by tests/transformer-js tools):

- `ALLOW_REMOTE=1` enables online model downloads
- `HF_TOKEN`, `HF_ENDPOINT` for Hugging Face
- `HTTP_PROXY`, `HTTPS_PROXY`, `NO_PROXY` for proxies
Examples:

```sh
zig build
zig build -Doptimize=Debug test
zig build inttest && zig-out/bin/aifw_core_test
```

Artifacts are installed under zig-out/.

Debug builds keep symbols so crashes can show stack traces:

```sh
zig build -Doptimize=Debug
```

Release builds strip symbols by default.
Run the Zig unit tests defined in core/aifw_core.zig:

```sh
zig build -Doptimize=Debug test
```

Example output:

```
masked=Contact me: __PII_EMAIL_ADDRESS_00000001__ and visit __PII_URL_ADDRESS_00000002__
restored=Contact me: a.b+1@test.io and visit https://ziglang.org
```
An integration test is provided at tests/test-aifw-core/test_session.zig and built as a standalone executable:

```sh
# Build and run the integration test
zig build inttest

# Or run the installed binary directly after build
zig-out/bin/aifw_core_test
```

This test exercises the full mask/restore pipeline, including the Rust regex recognizers.
The aifw core library is implemented in Zig with two build targets: native and wasm32-freestanding. It integrates a Rust‑based regex engine (using regex-automata) compiled to static libraries (.a) for both native and WASM, then linked into the Zig library.
Highlights:

- Pipeline architecture with two pipelines: `mask` and `restore`.
- Sessions hold configured components and allocators; pipelines are pure and side-effect-free.
- PII detection is produced by a composite of: RegexRecognizer (Rust regex via C ABI), NerRecognizer (external NER → spans), and SpanMerger (merge, filter, and de-duplicate spans).
- Rust regex is implemented with `regex-automata` and exposed via a C ABI static library; it is built for native and WASM targets and linked into the Zig core.
- Masking replaces sensitive spans with placeholders like `__PII_EMAIL_ADDRESS_00000001__` and records minimal metadata; restoring reconstructs the original text using that metadata.
- Placeholders are generated using a stack buffer (no heap); metadata stores only `(EntityType, serial_id)` to minimize memory and avoid pointer issues.
- Built and tested with Zig 0.15.1; Rust static libs are produced for native and `wasm32-unknown-unknown`.
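The placeholder scheme above can be sketched in a few lines. This Python toy mirrors the `__PII_<TYPE>_<serial>__` format and the placeholder-to-original metadata idea, but it is an illustration, not the Zig implementation (which stores only `(EntityType, serial_id)` and works on byte offsets):

```python
import re

def mask(text, spans):
    """Replace (start, end, entity_type) spans with serial placeholders and
    return the masked text plus a {placeholder: original} metadata map.
    Spans are applied right-to-left so earlier offsets stay valid."""
    meta, out, serial = {}, text, 0
    for start, end, entity in sorted(spans, reverse=True):
        serial += 1
        placeholder = f"__PII_{entity}_{serial:08d}__"
        meta[placeholder] = out[start:end]
        out = out[:start] + placeholder + out[end:]
    return out, meta

def restore(masked, meta):
    """Substitute every recorded placeholder back with its original text."""
    return re.sub(r"__PII_\w+_\d{8}__", lambda m: meta[m.group(0)], masked)

text = "Contact me: a.b+1@test.io"
masked, meta = mask(text, [(12, 25, "EMAIL_ADDRESS")])
print(masked)    # Contact me: __PII_EMAIL_ADDRESS_00000001__
print(restore(masked, meta) == text)
```

Because only placeholders leave the machine, the LLM never sees the original values; the metadata map stays local for the restore step.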
Example:

```json
{
  "openai-api-key": "xxxxxxxx.xxxx",
  "openai-base-url": "https://api.openai.com/v1",
  "openai-model": "gpt-4o-mini"
}
```

- openai-api-key: your LLM API key string used for authentication.
- openai-base-url: base URL of an OpenAI-compatible endpoint (e.g., OpenAI, a gateway, or a vendor's OpenAI-style API).
- openai-model: default model identifier for requests (can be overridden internally as needed).

Note: keys using underscores are also accepted (e.g., openai_api_key, openai_base_url, openai_model).
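Since both hyphenated and underscored key names are accepted, a loader typically normalizes them before lookup. A sketch of that idea (the helper name is hypothetical, not the actual service code):

```python
import json

def load_key_file(raw_json):
    """Parse the key file and normalize hyphenated keys to underscores,
    so openai-api-key and openai_api_key are interchangeable."""
    data = json.loads(raw_json)
    return {key.replace("-", "_"): value for key, value in data.items()}

raw = '{"openai-api-key": "xxxxxxxx.xxxx", "openai_model": "gpt-4o-mini"}'
cfg = load_key_file(raw)
print(cfg["openai_api_key"])   # works regardless of the original spelling
print(cfg["openai_model"])
```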
This repo uses a pnpm workspace (see pnpm-workspace.yaml) to manage all JS projects. Always install dependencies from the repo root:

```sh
pnpm -w install
```

The library bundles Transformers.js, copies ORT/AIFW WASM files, and stages the configured NER models.

Requirements:

- Build the core first (`zig build`) so `zig-out/bin/liboneaifw_core.wasm` exists
- Prepare model files under `ner-models/` or set `AIFW_MODELS_DIR`

Build:

```sh
# From repo root
pnpm -w --filter @oneaifw/aifw-js build
```

Environment variables affecting the model copy (see libs/aifw-js/scripts/copy-assets.mjs):

- `AIFW_MODELS_DIR` (defaults to `ner-models/`)
- `AIFW_MODEL_IDS`: comma-separated model IDs (default: `funstory-ai/neurobert-mini`)

These two variables must be set correctly when building the @oneaifw/aifw-js project.
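As an illustration of how such variables are typically consumed (a sketch of the documented defaults, not the actual copy-assets.mjs logic), the resolution amounts to:

```python
def model_config(env):
    """Resolve model staging settings using the documented defaults:
    AIFW_MODELS_DIR falls back to ner-models/, and AIFW_MODEL_IDS is a
    comma-separated list defaulting to funstory-ai/neurobert-mini."""
    models_dir = env.get("AIFW_MODELS_DIR", "ner-models/")
    model_ids = env.get("AIFW_MODEL_IDS", "funstory-ai/neurobert-mini")
    return models_dir, [m.strip() for m in model_ids.split(",") if m.strip()]

print(model_config({}))
print(model_config({"AIFW_MODEL_IDS": "a/model-1, b/model-2"}))
```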
Model preparation (optional helpers):

- tests/transformer-js/scripts/prep-models.mjs downloads and organizes models for browser usage
- Online: `ALLOW_REMOTE=1 node tests/transformer-js/scripts/prep-models.mjs`
- Honor `HF_TOKEN` and proxy vars as needed
```sh
pnpm -w --filter @oneaifw/aifw-js build
cd apps/webapp
pnpm dev
```

Offline build (copies the library dist and produces aifw-offline.html):

```sh
cd apps/webapp
pnpm offline
```

Serve with the cross-origin isolation helper (if needed for WASM threading):

```sh
pnpm run serve:coi
# open the printed URL
```

See browser_extension/README.md for packaging and loading into Chrome/Edge. In short:
```sh
pnpm -w --filter @oneaifw/aifw-js build
mkdir -p browser_extension/vendor/aifw-js
rsync -a --exclude 'models' libs/aifw-js/dist/* browser_extension/vendor/aifw-js
# then load the folder as an unpacked extension
```

The backend service and CLI are in py-origin/. It provides HTTP APIs:

- `/api/call`, `/api/mask_text`, `/api/restore_text`, `/api/mask_text_batch`, `/api/restore_text_batch`

Authentication:

- Standard `Authorization` header; configure the key with the env var `AIFW_HTTP_API_KEY` or a CLI option

Start here: py-origin/README.md.
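A client for these endpoints only needs plain HTTP plus the Authorization header. The sketch below builds (without sending) a request to /api/mask_text; the `{"text": ...}` payload shape and the Bearer scheme are assumptions for illustration, so check py-origin/README.md for the actual schema:

```python
import json
import urllib.request

def build_mask_request(base_url, api_key, text):
    """Build an authenticated POST request for the /api/mask_text endpoint.
    Payload shape and Bearer scheme are illustrative assumptions."""
    payload = json.dumps({"text": text}).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/api/mask_text",
        data=payload,
        method="POST",
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

req = build_mask_request("http://127.0.0.1:8844", "secret", "mail me at a@b.io")
print(req.full_url, req.get_method())
# Send with urllib.request.urlopen(req) once the service is running.
```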
This repository provides Docker images for running OneAIFW as a local HTTP service.

- For the legacy `py-origin` service (FastAPI + Presidio/LiteLLM), see the detailed Docker build instructions in py-origin/README.md.
- For the newer CLI-based Python web server (based on cli/python/aifw.py and uvicorn), you can build and run a self-contained image as follows.

From the repo root:

```sh
cd cli/python
docker build -t oneaifw:latest -f Dockerfile .
```

This image bundles:

- The Zig aifw core native library.
- The aifw-py Python binding and its dependencies.
- The cli/python HTTP server entrypoint (`aifw launch` under the hood).
Assuming your LLM API key JSON is at ~/.aifw/your-key.json on the host:

```sh
docker run --rm -p 8844:8844 \
  -e AIFW_API_KEY_FILE=/data/aifw/your-key.json \
  -v $HOME/.aifw:/data/aifw \
  oneaifw:latest
```

The container starts the AIFW HTTP server on port 8844 inside the container (exposed to the host via -p 8844:8844). You can then call the HTTP APIs, or use the aifw CLI from other containers pointed at this service.
You can provide the LLM API key file to the container via an environment variable and a bind mount. Two options:

- Put your key file inside your host work dir (~/.aifw) and mount the directory:

  ```sh
  # Ensure the key file is at ~/.aifw/your-key.json on the host
  docker run --rm -p 8844:8844 \
    -e AIFW_API_KEY_FILE=/data/aifw/your-key.json \
    -v $HOME/.aifw:/data/aifw \
    oneaifw:latest
  ```

- Or mount the key file directly to a path inside the container and point AIFW_API_KEY_FILE at it:

  ```sh
  docker run --rm -p 8844:8844 \
    -e AIFW_API_KEY_FILE=/data/aifw/your-key.json \
    -v /path/to/api-keys/your-key.json:/data/aifw/your-key.json \
    oneaifw:latest
  ```

Since the Docker image's default command already launches the HTTP server, you don't need to run aifw launch manually. You can still execute other commands inside the running container:
- Run the OneAIFW Docker image in interactive mode:

  ```sh
  docker run -it --name aifw \
    -p 8844:8844 \
    -e AIFW_API_KEY_FILE=/data/aifw/your-key.json \
    -v $HOME/.aifw:/data/aifw \
    oneaifw:latest \
    /bin/bash
  ```

- Start the OneAIFW server:

  ```sh
  # Use the CLI interface of OneAIFW inside the container
  python -m aifw launch
  ```

- Call OneAIFW to translate text or perform other tasks (the example prompt asks the model to translate the text into Chinese):

  ```sh
  # Use the CLI interface of OneAIFW inside the container
  python -m aifw call "请把如下文本翻译为中文: My email address is test@example.com, and my phone number is 18744325579."
  ```

- Stop the OneAIFW server:

  ```sh
  # Use the CLI interface of OneAIFW inside the container
  python -m aifw stop
  ```

- Exit the OneAIFW container and clean up resources:

  ```sh
  exit
  docker rm -f aifw
  ```