
ContextFlow AI 🧠

Local LLM Browser Agent · Chrome Extension · Manifest V3 · Zero API Calls

React TypeScript Chrome Extension Ollama License: MIT

ContextFlow AI is a privacy-first browser AI agent that brings local LLM intelligence directly into your browser — no OpenAI API key, no data leaving your machine. Summarize pages, ask questions about content, and right-click any selected text to get instant AI explanations — all powered by Llama 3.2 running on your own hardware.


🎬 Demo

Example workflow:

  1. Open any webpage
  2. Highlight a paragraph
  3. Right-click → Explain with ContextFlow AI
  4. AI explanation appears in a floating panel directly on the page

✨ Features

  • 🔒 100% Private — all inference runs locally via Ollama, zero external API calls
  • 🧭 Context-Aware — automatically extracts live page DOM for grounded responses
  • 🖱️ Right-Click Integration — "Explain with ContextFlow AI" on any selected text
  • 💬 Floating Panel — non-intrusive AI explanation panel injected directly into the webpage
  • 📄 Page Summarization — one-click summary of any webpage from the popup
  • 🤖 Q&A over Page Content — ask anything about what's currently on screen

🏗️ Architecture

Webpage
   │
   │ (DOM extraction)
   ▼
Content Script
   │
   │ (page context + selected text)
   ▼
Background Service Worker      ←──── Context Menu API
   │                                  (right-click trigger)
   │ (AI request)
   ▼
Ollama Local API (localhost:11434)
   │
   │ (AI response)
   ▼
Popup UI / Floating Panel

Component Responsibilities

| Component | Role |
| --- | --- |
| Content Script | Extracts page text, renders the floating explanation panel on the page |
| Background Service Worker | Agent runtime: handles events, sends prompts to Ollama, routes responses |
| Popup UI | React app: summarize the page, ask questions about its content |
| Context Menu API | Right-click "Explain" trigger for selected text |
| Ollama REST API | Local LLM inference at localhost:11434, private by design |
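The Content Script's extraction step can be sketched as a small pure function. This is an illustrative sketch, not the project's actual code: the whitespace normalization and the 8000-character cap are assumptions about how page text might be trimmed to fit a local model's context window.

```typescript
// Hypothetical sketch of the content script's text-extraction helper.
// In the extension, rawText would come from document.body.innerText;
// the 8000-char cap is an assumed context-window budget, not a real constant.
export function extractPageContext(rawText: string, maxChars = 8000): string {
  // Collapse the whitespace runs that innerText typically leaves behind,
  // then truncate so the prompt stays within the local model's context.
  const cleaned = rawText.replace(/\s+/g, " ").trim();
  return cleaned.length > maxChars ? cleaned.slice(0, maxChars) + "…" : cleaned;
}
```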

The background service worker acts as a persistent agent runtime — it orchestrates all communication between the DOM, the context menu, and the local LLM, and never sends data to the cloud.
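A minimal sketch of that orchestration, assuming Ollama's `/api/generate` endpoint with streaming disabled; the helper name `buildOllamaRequest` and the message shape are illustrative, not the extension's actual API. The `chrome` wiring is guarded so the sketch also runs outside an extension context.

```typescript
// Hypothetical sketch of the agent loop in the background service worker.
const OLLAMA_URL = "http://localhost:11434/api/generate";

interface OllamaRequest {
  model: string;
  prompt: string;
  stream: boolean;
}

// Pure helper: package a prompt for the local model.
export function buildOllamaRequest(prompt: string, model = "llama3.2"): OllamaRequest {
  return { model, prompt, stream: false };
}

// Extension wiring (runs only inside Chrome, where `chrome` is defined).
declare const chrome: any;
if (typeof chrome !== "undefined" && chrome.runtime?.onMessage) {
  chrome.runtime.onMessage.addListener(
    (msg: any, _sender: any, sendResponse: (r: any) => void) => {
      fetch(OLLAMA_URL, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(buildOllamaRequest(msg.prompt)),
      })
        .then((res) => res.json())
        .then((data) => sendResponse({ answer: data.response }));
      return true; // keep the message channel open for the async reply
    }
  );
}
```

Returning `true` from the listener is the Manifest V3 idiom for telling Chrome the response will arrive asynchronously.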


🚀 Quick Start

Prerequisites

  1. Ollama installed and running
  2. Llama 3.2 pulled: ollama pull llama3.2
  3. Chrome browser

Installation

git clone https://github.com/krishnakoushik225/contextflow-ai
cd contextflow-ai
npm install
npm run build

Load in Chrome

  1. Open chrome://extensions/
  2. Enable Developer Mode (top right)
  3. Click Load unpacked
  4. Select the dist/ folder

The ContextFlow AI icon now appears in your Chrome toolbar. You're ready.


🎮 Usage

Summarize a Page

  1. Open any webpage
  2. Click the ContextFlow AI toolbar icon
  3. Click Summarize Current Page

Ask About Page Content

  1. Open the extension popup
  2. Type a question: "What are the key points of this article?"
  3. Click Ask AI — response uses live page content as context
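The steps above can be sketched as a prompt-grounding helper: the question is combined with the live page text before being sent to the background worker. The template wording here is an assumption, not the extension's exact prompt.

```typescript
// Illustrative sketch: ground the popup's question in the current page text.
export function buildGroundedPrompt(question: string, pageText: string): string {
  return [
    "Answer the question using only the page content below.",
    "--- PAGE CONTENT ---",
    pageText,
    "--- QUESTION ---",
    question,
  ].join("\n");
}
```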

Explain Selected Text

  1. Highlight any text on any webpage
  2. Right-click → Explain with ContextFlow AI
  3. Floating explanation panel appears directly on the page
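The right-click flow above can be sketched as a context-menu registration plus a prompt builder for the selection. This is a hedged sketch: the menu id, prompt wording, and message shape are illustrative assumptions, and the `chrome` wiring is guarded so the file also runs outside an extension.

```typescript
// Hypothetical sketch of the "Explain with ContextFlow AI" right-click flow.
export function buildExplainPrompt(selection: string): string {
  return `Explain the following text in simple terms:\n\n"${selection.trim()}"`;
}

declare const chrome: any;
if (typeof chrome !== "undefined" && chrome.contextMenus) {
  // Register the menu item shown when text is selected.
  chrome.contextMenus.create({
    id: "contextflow-explain",
    title: "Explain with ContextFlow AI",
    contexts: ["selection"],
  });
  chrome.contextMenus.onClicked.addListener((info: any, tab: any) => {
    if (info.menuItemId === "contextflow-explain" && info.selectionText) {
      // Forward to the background worker, which calls the local Ollama API.
      chrome.runtime.sendMessage({
        prompt: buildExplainPrompt(info.selectionText),
        tabId: tab?.id,
      });
    }
  });
}
```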

🛠️ Tech Stack

| Layer | Technology |
| --- | --- |
| Extension Framework | Chrome Manifest V3 |
| Language | TypeScript |
| UI | React + Tailwind CSS |
| Build Tool | Vite |
| LLM | Ollama + Llama 3.2 (local) |
| DOM Integration | Content Scripts |

📁 Project Structure

contextflow-ai/
├── public/
│   ├── background.js        # Background service worker (agent runtime)
│   ├── contentScript.js     # DOM extraction + floating panel injection
│   ├── manifest.json        # Manifest V3 config
│   └── icon-128.jpg
├── src/
│   ├── App.tsx              # React popup UI
│   ├── main.tsx             # Popup entrypoint
│   ├── App.css
│   └── index.css
├── dist/                    # Built extension (load this in Chrome)
├── package.json
├── tailwind.config.js
├── vite.config.ts
└── README.md

🔒 Privacy

ContextFlow AI is designed privacy-first:

  • No telemetry — zero analytics or tracking
  • No cloud API calls — all inference is local via Ollama
  • No data storage — page content is never persisted or transmitted
  • Open source — audit every line

🔭 Roadmap

  • Draggable floating panel
  • Follow-up questions inside the panel
  • Persistent AI sidebar
  • Multi-model support (Mistral, Phi, Gemma)
  • Citation-based answers with source highlighting
  • Multi-tab context understanding

📄 License

MIT — free to use and build on.


Built by Krishna Koushik Unnam · AI Systems Developer
