Local-first, Gemini-style AI assistant with grounded web Q&A, safe automation workflows, and portable USB-ready deployment.
- Multi-provider support: Gemini API, OpenAI-compatible endpoints, and Ollama local models
- Grounded webpage Q&A flow with safer fallback behavior
- Safety review queue and audit-friendly approval workflow
- Mini, ultra-mini, and floating panel UI modes
- Portable packaging for flash drive deployment
The app now supports real OpenAI-compatible chat endpoints from the sidebar settings.
- Endpoint: for example `http://localhost:11434/v1/chat/completions` (Ollama) or your hosted compatible endpoint
- Model: the model name exposed by your endpoint
- API Key: optional for local endpoints, required for hosted providers
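As a rough sketch of what these settings produce on the wire, the snippet below builds an OpenAI-compatible chat request. The field names follow the standard `/v1/chat/completions` schema; the endpoint and model shown are illustrative Ollama values, not GOD's actual internals.

```javascript
// Build a request for an OpenAI-compatible chat endpoint.
// Endpoint/model values here are assumptions (Ollama defaults);
// hosted providers additionally need the Bearer API key.
function buildChatRequest({ endpoint, model, apiKey }, userMessage) {
  const headers = { "Content-Type": "application/json" };
  if (apiKey) headers["Authorization"] = `Bearer ${apiKey}`; // hosted providers only
  return {
    url: endpoint,
    options: {
      method: "POST",
      headers,
      body: JSON.stringify({
        model,
        messages: [{ role: "user", content: userMessage }],
      }),
    },
  };
}

const req = buildChatRequest(
  { endpoint: "http://localhost:11434/v1/chat/completions", model: "llama3" },
  "Hello"
);
// Send with: fetch(req.url, req.options)
```

Note that without an API key no `Authorization` header is sent, which matches the "optional for local endpoints" behavior above.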
Presets available in app:
- Ollama (local)
- OpenAI compatible
- Gemini API (`generateContent` endpoint, default model: `gemini-1.5-flash`)
For Gemini, changing the Model field (for example to `gemini-1.5-flash` or `gemini-1.5-pro`) now updates requests automatically.
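To show how the Model field maps onto the request, here is a sketch of the Gemini `generateContent` URL. The base URL follows Google's public REST pattern (`v1beta`); treat the exact path and API version as an assumption and check the official docs for your setup.

```javascript
// Sketch: the Model field is interpolated into the generateContent
// REST path. The base URL/version is an assumption based on Google's
// published pattern, not necessarily what GOD emits verbatim.
function geminiUrl(model, apiKey) {
  return `https://generativelanguage.googleapis.com/v1beta/models/${model}:generateContent?key=${apiKey}`;
}

const url = geminiUrl("gemini-1.5-pro", "YOUR_KEY");
```

Switching the Model field from `gemini-1.5-flash` to `gemini-1.5-pro` simply changes the `models/...` segment of this URL.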
To provide a local default key, use .env.local:
```
VITE_GOD_API_KEY=your_key_here
```

Settings are stored in browser local storage for portability.
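The precedence this implies can be sketched as follows: a key saved in the sidebar (local storage) wins, with the build-time `VITE_GOD_API_KEY` value as a fallback. The storage key name below is hypothetical, not GOD's actual one.

```javascript
// Sketch of settings precedence: local-storage value first,
// then the VITE_GOD_API_KEY env fallback. "god.settings.apiKey"
// is a hypothetical storage key for illustration.
function resolveApiKey(storage, envKey) {
  const saved = storage.getItem("god.settings.apiKey");
  return saved || envKey || "";
}

// Minimal stand-in for window.localStorage with nothing saved:
const fakeStorage = { getItem: () => null };
const key = resolveApiKey(fakeStorage, "env_key_123");
```

In the browser you would pass `window.localStorage` and `import.meta.env.VITE_GOD_API_KEY` instead of the stand-ins.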
Use GOD's sidebar button **Copy Attach Bookmarklet**.
- Open GOD in your browser.
- Click **Copy Attach Bookmarklet**.
- Create a bookmark in your browser and paste the copied text into the bookmark URL/location field.
- While on any site, click that bookmark to toggle a floating GOD panel.
Note: some websites may block embedded iframes via security headers.
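For orientation, this is the general shape of a bookmarklet that toggles a floating iframe panel. It is illustrative only; the real bookmarklet is the one the Copy button generates, and the app URL here is a placeholder.

```javascript
// Illustrative sketch of a panel-toggling bookmarklet, NOT the one
// GOD generates. Clicking it removes the panel if present, otherwise
// injects a fixed-position iframe pointing at the app URL.
function makeBookmarklet(appUrl) {
  const src =
    `(function(){var f=document.getElementById('god-panel');` +
    `if(f){f.remove();return;}` +
    `f=document.createElement('iframe');f.id='god-panel';f.src='${appUrl}';` +
    `f.style.cssText='position:fixed;right:16px;bottom:16px;width:360px;` +
    `height:480px;z-index:2147483647;border:0';` +
    `document.body.appendChild(f);})()`;
  return "javascript:" + encodeURIComponent(src);
}

const link = makeBookmarklet("http://localhost:4173");
```

Sites that send `X-Frame-Options` or a restrictive `Content-Security-Policy: frame-ancestors` will refuse the iframe, which is the blocking behavior the note above describes.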
This build does not support:
- reverse engineering third-party products
- privilege escalation or admin-right bypasses
- hidden system-level control
For classroom troubleshooting, GOD includes a manual approval queue:
- Instructor can queue elevated actions for review
- Admin-related prompts are routed to pending approval (not auto-executed)
- Each approve/deny decision is written to a local audit log
This workflow is for transparent review and auditing only, not privilege bypass.
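A decision record in such an audit log might look like the sketch below. The field names are illustrative, not GOD's actual log format.

```javascript
// Sketch of an audit-friendly decision record for the approval
// queue described above. Field names are hypothetical.
function recordDecision(log, action, decision, reviewer) {
  log.push({
    action,                           // the queued elevated action
    decision,                         // "approve" or "deny"
    reviewer,                         // who made the call
    timestamp: new Date().toISOString(),
  });
  return log;
}

const auditLog = [];
recordDecision(auditLog, "restart lab print service", "deny", "instructor-01");
```

Because every entry carries the reviewer and timestamp, the log supports after-the-fact review rather than any form of automatic execution.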
```
npm install
npm run dev
```

To produce a production build:

```
npm run build
```

GOD builds as a static web app and can be deployed to any static host reachable by browsers.
- Prepare a cross-browser build:

  ```
  npm install
  npm run deploy:prepare
  ```

- Deploy the generated `dist/` folder to any static host (GitHub Pages, Netlify, Vercel static, Nginx, Apache, S3 static website, USB static server).
- Open the deployed URL in Chrome, Edge, Firefox, Brave, or Safari.
If your AI endpoint blocks browser CORS, use Ollama local endpoint or a backend relay.
- Build and package:

  ```
  npm run portable
  ```

- Copy the generated `portable/` folder to your USB flash drive.
- On another Windows machine with Node.js installed, run:

  ```
  powershell -ExecutionPolicy Bypass -File .\run-portable.ps1
  ```

  Then open http://localhost:4173.
Note: the target machine must be able to reach your configured model endpoint (local service or network URL).