This guide covers running SecAI OS services locally for development and testing, without building a full OS image.
- Go 1.25+ for building Go services
- Python 3.12 (recommended) for running Python services. CI and lockfiles use Python 3.12; package metadata still allows Python 3.11 where scanner compatibility requires it.
- pip for Python dependency management
- git for version control
- make (optional, for convenience targets)
```
git clone https://github.com/SecAI-Hub/SecAI_OS.git
cd SecAI_OS
```

Each Go service is in its own directory under services/. To exercise the same service set as CI:

```
make test-go
```

To build or run an individual service, enter that service directory and use the normal Go tooling:
```
cd services/registry
go build -o registry .
./registry
```

The Go service set is: airlock, registry, tool-firewall, gpu-integrity-watch, mcp-firewall, policy-engine, runtime-attestor, integrity-monitor, and incident-recorder.
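All nine services can be built in one loop with the same `go build` invocation shown above. The helper below is a sketch only: `build_all` and `GO_SERVICES` are illustrative names, not part of the repo.

```python
import subprocess
from pathlib import Path

# Service list as documented in this guide.
GO_SERVICES = [
    "airlock", "registry", "tool-firewall", "gpu-integrity-watch",
    "mcp-firewall", "policy-engine", "runtime-attestor",
    "integrity-monitor", "incident-recorder",
]

def build_all(root="services", dry_run=False):
    """Build each Go service in its own directory; return the planned commands."""
    commands = []
    for svc in GO_SERVICES:
        cmd = ["go", "build", "-o", svc, "."]
        commands.append((Path(root) / svc, cmd))
        if not dry_run:
            subprocess.run(cmd, cwd=Path(root) / svc, check=True)
    return commands
```

With `dry_run=True` the helper only reports what it would run, which is handy for checking the service list before a real build.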
```
python -m pip install -r requirements-ci.txt
python -m pip install -r services/agent/requirements.txt
python -m pip install -r services/search-mediator/requirements.txt
python -m pip install --require-hashes -r services/ui/requirements.lock
python -m pip install --require-hashes -r services/quarantine/requirements.lock
python -m pip install -e services/agent -e services/ui -e services/quarantine
```

To start the UI:

```
cd services/ui
python -m ui.app
```

The UI listens on port 8480. Open http://localhost:8480 in a browser.
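Before opening a browser, you can check from a script whether the UI is answering. The helper below uses only the standard library; `ui_is_up` is a hypothetical convenience function, not part of the repo.

```python
import urllib.error
import urllib.request

def ui_is_up(url="http://localhost:8480", timeout=2):
    """Return True if the service at url answers an HTTP request at all."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except urllib.error.HTTPError:
        return True   # server answered, just with an error status
    except OSError:
        return False  # connection refused, timeout, DNS failure, ...
```

The same check works for any of the local service ports, e.g. `ui_is_up("http://localhost:8485")` for the search mediator.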
The quarantine pipeline runs as a watcher service that monitors the quarantine directory:
```
cd services/quarantine
python -m quarantine.watcher
```

To run the search mediator:

```
cd services/search-mediator
python app.py
```

The search mediator listens on port 8485. It requires a running SearXNG instance and Tor for full functionality.
To run the Go test suite:

```
make test-go
```

To run the Python tests:

```
# All Python tests
PYTHONPATH=services python -m pytest tests/ -v

# Specific test suites
PYTHONPATH=services python -m pytest tests/test_quarantine_pipeline.py -v
PYTHONPATH=services python -m pytest tests/test_ui.py -v
PYTHONPATH=services python -m pytest tests/test_memory_protection.py -v
PYTHONPATH=services python -m pytest tests/test_differential_privacy.py -v
PYTHONPATH=services python -m pytest tests/test_traffic_analysis.py -v
```

Services look for configuration files in the following order:
1. The path given by an environment variable (e.g., SECAI_POLICY_PATH)
2. ./policy.yaml in the current working directory
3. /etc/secure-ai/policy/policy.yaml (the production path, unlikely to exist in dev)
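The lookup order above can be sketched in a few lines. `resolve_policy_path` is illustrative, not the services' actual loader:

```python
import os
from pathlib import Path

def resolve_policy_path(env_var="SECAI_POLICY_PATH"):
    """Return the first existing policy file, following the documented order."""
    candidates = []
    env_value = os.environ.get(env_var)
    if env_value:
        candidates.append(Path(env_value))                       # 1. env override
    candidates.append(Path.cwd() / "policy.yaml")                # 2. working dir
    candidates.append(Path("/etc/secure-ai/policy/policy.yaml")) # 3. production path
    return next((p for p in candidates if p.is_file()), None)
```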
For development, copy the default policy file:
```
cp files/system/etc/secure-ai/policy/policy.yaml ./policy.yaml
```

Edit policy.yaml to adjust settings for your dev environment.
When running services directly (outside of the full OS image), the following security features are not active:
- Systemd sandboxing: ProtectSystem, ProtectHome, PrivateTmp, NoNewPrivileges, and other systemd hardening directives only apply when services run under systemd.
- nftables firewall: Network rules are not applied in dev mode. Services can make arbitrary network connections.
- Seccomp-BPF filters: System call filtering requires the systemd service units.
- Landlock LSM: Filesystem access restrictions require the systemd service units.
- Encrypted vault: The LUKS encrypted volume is not present in dev mode. Models are stored in plain directories.
- Read-only root: The immutable filesystem is a property of the OS image, not the services.
Dev mode is for development and testing only. Do not use dev mode for processing sensitive data or running untrusted models.
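A service can make this gap visible at startup. systemd exports an INVOCATION_ID environment variable to the units it runs, so its absence is a rough but usable dev-mode signal. The helper below is a sketch, not existing repo code:

```python
import os

def likely_under_systemd(environ=os.environ):
    """Heuristic: systemd sets INVOCATION_ID for the services it manages."""
    return "INVOCATION_ID" in environ

def warn_if_dev_mode(environ=os.environ):
    """Print a loud warning when the systemd protections are not in effect."""
    if not likely_under_systemd(environ):
        print("WARNING: running outside systemd - sandboxing, nftables, "
              "seccomp, Landlock, and the encrypted vault are NOT active.")
```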
If you do not have llama.cpp installed or do not need actual inference:
- The UI, registry, and tool firewall can run independently.
- Chat and generation endpoints will return errors without an inference worker.
- Model management (import, quarantine, promote) works without inference.
To set up llama-server for local inference:
```
# Build llama.cpp
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
make -j$(nproc)

# Start the server with a model
./llama-server -m /path/to/model.gguf --port 8081
```
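Once llama-server is running, services reach it over plain HTTP. The snippet below builds a request for the server's /completion endpoint (the prompt/n_predict fields follow llama.cpp's server API; the helper name and default port are assumptions for illustration):

```python
import json
import urllib.request

def completion_request(prompt, base_url="http://localhost:8081", n_predict=64):
    """Build an HTTP POST request for llama-server's /completion endpoint."""
    body = json.dumps({"prompt": prompt, "n_predict": n_predict}).encode()
    return urllib.request.Request(
        f"{base_url}/completion",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending it (requires a running llama-server):
#   with urllib.request.urlopen(completion_request("Hello")) as resp:
#       print(json.loads(resp.read())["content"])
```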