The Cortex Agent is a professional-grade intelligence system acting as a bridge between Telegram, Notion, and Google Gemini AI. It uses a hybrid desktop architecture to provide real-time task extraction, deduplication, and context-aware intelligence.
The application runs as a Tauri desktop app with a bundled Python Sidecar:
- Frontend (Tauri/Rust): Handles the OS window, system tray, and process lifecycle management. It launches and monitors the Python agent.
- Backend (Python Sidecar): The "Brain". Runs as a subprocess (`cortex-agent`).
  - Listener: Connects to Telegram (MTProto) to intercept messages.
  - Agent: Uses Google Gemini 2.0 to analyze message priority and extract tasks.
  - NotionSync: Synchronizes high-priority items (P1-P3) to a Notion Database.
  - Server: Exposes a local FastAPI server (port 8000) for the UI dashboard.
- Node.js & npm
- Rust (Cargo)
- Python 3.11+
- Notion Integration Token & Database ID
- Telegram API ID & Hash
- Clone & Install Dependencies:

  ```bash
  git clone <repo>
  npm install
  ```

- Setup Python Backend:

  ```bash
  cd backend
  python3 -m venv venv
  source venv/bin/activate
  pip install -r requirements.txt
  ```

- Environment Configuration: Create a `.env` file in `backend/`, or rely on the Setup Wizard (first run) to create `~/.cortex/config.json`.

  Key Variables:
  - `API_ID`, `API_HASH`: Telegram Auth
  - `NOTION_TOKEN`, `NOTION_DATABASE_ID`: Notion Access
  - `GENAI_KEY`: Google AI Key
To start the full stack (Tauri App + Python Sidecar):

```bash
npm run tauri dev
```

- The Python binary is automatically rebuilt via `backend/build.py` if sources change.
- Logs are written to `~/.cortex/cortex.log`.
The backend includes a comprehensive test suite using pytest.
- Activate Backend Environment:

  ```bash
  cd backend
  source venv/bin/activate
  ```

- Run Tests:

  ```bash
  pytest tests
  ```
To run the agent locally (without the Tauri frontend) to verify behavior:
- Ensure Environment is Active:

  ```bash
  source venv/bin/activate
  ```

- Run Main Script:

  ```bash
  python main.py
  ```

This will start the Telegram Listener and the Dashboard API server at http://localhost:8000.
The agent uses a Hybrid Defense strategy to prevent duplicate tasks:
- Layer 1 (Local Cache): An in-memory cache blocks duplicates instantly (0ms latency).
- Layer 2 (Verified Search): A synchronous check against Notion's `search` API ensures cross-session consistency, filtering out fuzzy matches manually.
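The two layers can be sketched as follows; `Deduplicator` is a hypothetical class, and the `remote_search` callable stands in for the real Notion search client:

```python
class Deduplicator:
    """Two-layer duplicate guard: fast local cache first, remote check second."""

    def __init__(self, remote_search):
        # remote_search(title) -> bool is a stand-in for the Notion search call.
        self._seen: set[str] = set()          # Layer 1: in-memory cache (instant)
        self._remote_search = remote_search   # Layer 2: cross-session verified search

    def is_duplicate(self, title: str) -> bool:
        key = title.strip().lower()
        if key in self._seen:                 # Layer 1: block instantly, no I/O
            return True
        if self._remote_search(title):        # Layer 2: consult the remote store
            self._seen.add(key)               # warm the cache for next time
            return True
        self._seen.add(key)                   # remember new titles too
        return False
```

Note the cache is warmed on both outcomes, so repeated titles never hit the remote API twice within one session.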
- Universal Outgoing Filter: Ignores all messages sent by you (DMs & Groups) to prevent echo.
- Strict Triage:
- P1-P3: Created as Notion Tasks.
- P4 (Ignore): Polite chatter ("Thanks", "Ok") is logged to Audit but never creates a task.
- P5 (Spam): Completely discarded.
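The triage rules above can be sketched as a small router; `triage`, `create_task`, and `audit_log` are hypothetical names, not the agent's actual API:

```python
def triage(priority: int, message: str, create_task, audit_log) -> str:
    """Route a classified message by priority (P1=1 ... P5=5)."""
    # P1-P3: actionable -> becomes a Notion task
    if 1 <= priority <= 3:
        create_task(message)
        return "task"
    # P4: polite chatter -> audit log only, never a task
    if priority == 4:
        audit_log(message)
        return "audited"
    # P5: spam -> discarded entirely
    return "discarded"
```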
To prevent a flood of old alerts on restart, the agent enforces a "Time Guard":
- Default: Messages older than 120 seconds are ignored.
- Configuration: Change `CATCH_UP_SECONDS` in `~/.cortex/config.json` or via the Settings Dashboard.
```text
cortex-desktop/
├── src-tauri/           # Rust/Tauri Core
├── src/                 # Frontend Assets (Loading Screen)
├── backend/             # Python Source Code
│   ├── listener.py      # Telegram Event Loop
│   ├── agent.py         # Gemini AI Wrapper
│   ├── notion_sync.py   # Notion API Logic
│   ├── server.py        # FastAPI Dashboard Backend
│   └── build.py         # PyInstaller Build Script
└── README.md            # This file
```
The application uses a priority-based configuration system, loading settings from two sources:
| Priority | Source | Location | Purpose |
|---|---|---|---|
| High | Config File | `~/.cortex/config.json` | Production/Desktop App. These settings (saved via Settings UI) override everything else. |
| Low | Environment | `backend/.env` | Development/CLI. Used when running `python main.py` directly if `config.json` is missing or keys are not defined there. |
- The app checks `~/.cortex/config.json`.
- If a key (e.g., `NOTION_TOKEN`) is missing there, it falls back to the `.env` file or system environment variables.
- Result: If you change a setting in the Desktop UI, it updates `config.json` and takes immediate effect, ignoring your `.env`.
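The precedence rules can be sketched as a small resolver; `load_setting` is a hypothetical helper, not the app's actual loader:

```python
import json
import os
from pathlib import Path

def load_setting(key: str,
                 config_path: Path = Path.home() / ".cortex" / "config.json"):
    """Resolve a setting: config.json wins, then the environment (.env) as fallback."""
    if config_path.exists():
        try:
            config = json.loads(config_path.read_text())
        except json.JSONDecodeError:
            config = {}  # a corrupt file falls through to the environment
        if key in config:
            return config[key]       # high priority: values saved via Settings UI
    return os.environ.get(key)       # low priority: .env / shell environment
```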
All sensitive data and logs are stored in your home directory:
- Linux/macOS: `~/.cortex/`
  - `config.json`: Persistent settings.
  - `cortex.log`: Application logs.
  - `memory.json`: Long-term memory database (AI knowledge).
  - `cortex.session`: Telegram session file.
> **Warning**
> Security Note: Data in `~/.cortex/` is stored unencrypted.
> It is recommended to restrict permissions on this folder:

```bash
chmod 700 ~/.cortex
chmod 600 ~/.cortex/config.json
```

**"App is damaged and can't be opened" (macOS)**

This occurs because the release is unsigned. Fix it by running:

```bash
xattr -cr /Applications/Cortex.app
```

**Logs & Debugging**
Logs are written to ~/.cortex/cortex.log.
If the app crashes immediately on start (missing logs), run the binary manually to see the output:

```bash
/Applications/Cortex.app/Contents/MacOS/cortex-agent-*
```

**"Port 8000 already in use"**
- The sidecar usually cleans up after itself via a stdin watchdog (if the parent app dies, the agent dies).
- If it crashes hard, run `fuser -k 8000/tcp` (Linux) or `lsof -i :8000` (macOS) to find and kill the process.
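The stdin watchdog mentioned above can be sketched like this; `start_stdin_watchdog` is an illustrative name, assuming the sidecar inherits a pipe from the Tauri parent process:

```python
import os
import sys
import threading

def start_stdin_watchdog() -> threading.Thread:
    """Exit the sidecar when the parent process closes our stdin.

    When the Tauri app dies, the pipe feeding stdin is closed, sys.stdin.read()
    returns, and the agent shuts itself down instead of lingering and holding
    port 8000 open.
    """
    def watch():
        sys.stdin.read()   # blocks until EOF (parent is gone)
        os._exit(0)        # hard-exit so the port is released immediately
    t = threading.Thread(target=watch, daemon=True, name="stdin-watchdog")
    t.start()
    return t
```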
"ModuleNotFoundError" in Sidecar
- This usually means PyInstaller missed a file. Check `backend/build.py` and ensure the new module is added to `hiddenimports`.
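For orientation, a PyInstaller-based build script typically registers such modules via `--hidden-import`. This is an illustrative fragment, not the actual contents of `build.py`:

```python
# Illustrative fragment: how a build script might pass hidden imports to PyInstaller.
import PyInstaller.__main__

PyInstaller.__main__.run([
    "main.py",
    "--name", "cortex-agent",
    "--onefile",
    # Add any module PyInstaller fails to detect automatically:
    "--hidden-import", "notion_sync",
])
```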