AI-powered search results summary generator that works transparently with existing search engine instances (SearXNG and 4get) using OpenAI-compatible AI providers.
demo.mp4
1. Create a configuration directory:

   ```sh
   mkdir -p ./search-results-summarizer/config
   cd ./search-results-summarizer
   ```

2. Run the container:

   ```sh
   docker run -d --name search-results-summarizer \
     -p 3000:3000 \
     -v "./config:/config" \
     --restart unless-stopped \
     ghcr.io/chandujr/search-results-summarizer:latest
   ```

3. Configure the service:

   ```sh
   # Edit the configuration files
   nano ./config/config.yaml
   nano ./config/.env
   ```

4. Restart the container to apply changes:

   ```sh
   docker restart search-results-summarizer
   ```
1. Create a project directory and a `docker-compose.yml`:

   ```sh
   mkdir -p ./search-results-summarizer/config
   cd ./search-results-summarizer
   cat > docker-compose.yml << EOF
   services:
     search-results-summarizer:
       image: ghcr.io/chandujr/search-results-summarizer:latest
       container_name: search-results-summarizer
       ports:
         - "3000:3000"
       volumes:
         - ./config:/config
       restart: unless-stopped
   EOF
   ```

2. Start the service:

   ```sh
   docker-compose up -d
   ```

3. Configure the service:

   ```sh
   # Edit the configuration files
   nano ./config/config.yaml
   nano ./config/.env
   ```

4. Restart the service to apply changes:

   ```sh
   docker-compose restart
   ```
1. Clone the repository:

   ```sh
   git clone https://github.com/chandujr/search-results-summarizer.git
   cd search-results-summarizer
   ```

2. Copy the configuration templates:

   ```sh
   cp config/config.yaml.default config/config.yaml
   cp config/.env.default config/.env
   ```

3. Edit your configuration:

   ```sh
   nano config/config.yaml
   nano config/.env
   ```

4. Build and run with Docker Compose:

   ```sh
   docker-compose up -d --build
   ```
In `config/.env`, set your API key based on your provider:

```sh
# For OpenRouter
OPENROUTER_API_KEY=your_api_key_here
```
In `config/config.yaml`, configure:

- `ENGINE_NAME`: `"searxng"` or `"4get"`
- `ENGINE_URL`: URL of your search engine instance (see the networking note below); works with both locally installed and public instances
- `SUMMARIZER_LLM_URL`: URL of the AI provider used for summarization
- `SUMMARIZER_MODEL_ID`: AI model used for summarization
- `CLASSIFIER_LLM_URL`: URL of the AI provider that decides whether to show a summary in `smart` mode
- `CLASSIFIER_MODEL_ID`: AI model that decides whether to show a summary in `smart` mode; should be a model that supports function/tool calling
- `MAX_TOKENS`: maximum tokens for AI responses (default: `750`)
- `SUMMARY_MODE`: `"auto"` (automatic), `"manual"` (button-triggered), or `"smart"` (AI decides when to summarize)
- `MAX_RESULTS_FOR_SUMMARY`: number of results to summarize (default: `7`)
- `MODIFY_CSP_HEADERS`: set to `true` if using public search engine instances that block external scripts (default: `false`)
- `TRUST_PROXY`: set to `true` when running behind a reverse proxy; needed for proper rate limiting (default: `false`)
- `PROXY_IP_RANGE`: IP range of the trusted proxy when `TRUST_PROXY` is enabled (default: `"10.0.0.0/8"`, for Render)
- `RATE_LIMIT_MS`: rate limit in milliseconds between requests (default: `1000`)
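To make the `TRUST_PROXY` / `PROXY_IP_RANGE` pair concrete, here is a minimal sketch (an illustration, not the project's actual code) of the check a rate limiter behind a reverse proxy must perform: the forwarded client address is only believed when the connecting peer falls inside the trusted range.

```python
import ipaddress

def peer_is_trusted_proxy(peer_ip: str, proxy_ip_range: str = "10.0.0.0/8") -> bool:
    """Return True if the connecting peer lies inside the trusted proxy range.

    Only then should a forwarded client IP (X-Forwarded-For) be used
    for rate limiting instead of the peer's own address.
    """
    return ipaddress.ip_address(peer_ip) in ipaddress.ip_network(proxy_ip_range)

# With the default range ("10.0.0.0/8", the one used on Render):
print(peer_is_trusted_proxy("10.1.2.3"))     # True  -> rate-limit the forwarded client IP
print(peer_is_trusted_proxy("203.0.113.9"))  # False -> rate-limit the peer itself
```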
- `MIN_KEYWORD_COUNT`: minimum number of keywords required (default: `3`)
- `MIN_RESULT_COUNT`: minimum search results required (default: `3`)
- `EXCLUDE_WORDS`: words that prevent summarization
- `EXCLUDE_OVERRIDES`: words that override the exclude list and force summarization

Note: `"smart"` mode bypasses these filters and uses AI to determine if summarization is needed.
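The interplay of these filters can be sketched roughly as follows. This is a hypothetical illustration of the behavior described above, not the project's implementation; the function and parameter names are invented:

```python
def should_summarize(query: str, results: list,
                     min_keyword_count: int = 3,
                     min_result_count: int = 3,
                     exclude_words=frozenset(),
                     exclude_overrides=frozenset()) -> bool:
    """Illustrative filter order: overrides win, then excludes, then minimums."""
    words = set(query.lower().split())
    if words & set(exclude_overrides):   # override words force summarization
        return True
    if words & set(exclude_words):       # excluded words prevent it
        return False
    return (len(words) >= min_keyword_count
            and len(results) >= min_result_count)

print(should_summarize("best rust web frameworks", ["r"] * 5))  # True
print(should_summarize("weather", ["r"] * 5))                   # False (too few keywords)
print(should_summarize("download movie torrents now", ["r"] * 5,
                       exclude_words={"torrents"}))             # False (excluded word)
```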
You may not be able to use `localhost` in these URLs: inside a Docker container, `localhost` refers to the container itself, not the host machine.

To connect to services running on your host machine (your search engine or Ollama), use one of the following:
- Use the Docker network gateway IP:

  ```sh
  # Find your Docker network name
  docker network ls | grep search-results-summarizer

  # Inspect the network to find the gateway IP
  docker network inspect <network_name> | grep Gateway
  # Example result: "Gateway": "172.19.0.1"

  # Set ENGINE_URL to: http://172.19.0.1:8081 (for search engine)
  ```

- Use your host machine's IP address:

  ```sh
  # Find your host IP
  ip route get 1.1.1.1 | awk '{print $7}'
  # Example result: 192.168.1.100

  # Set ENGINE_URL to: http://192.168.1.100:8081 (for search engine)
  ```
- Access the service at `http://localhost:3000` (or your configured port) to verify it's working.
- Visit the service once; you can then set it as your default search engine through your browser's settings. Most modern browsers allow you to add it as a search engine.
- Summaries are generated by sending the search query and results to your selected AI provider.
- No data is stored by this proxy.
- Consider the privacy implications before use.
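For clarity about what actually leaves your machine, here is a rough sketch of the kind of request body an OpenAI-compatible chat-completions call carries. The payload shape follows the standard chat-completions convention; the helper, field contents, and prompt wording are illustrative, not the project's actual code:

```python
import json

def build_summary_request(query: str, results: list, model: str,
                          max_tokens: int = 750) -> dict:
    """Assemble an OpenAI-compatible chat-completions payload.

    The search query and result snippets are the only user data included.
    """
    snippets = "\n".join(f"- {r['title']}: {r['snippet']}" for r in results)
    return {
        "model": model,
        "max_tokens": max_tokens,
        "messages": [
            {"role": "system",
             "content": "Summarize these search results concisely."},
            {"role": "user",
             "content": f"Query: {query}\nResults:\n{snippets}"},
        ],
    }

payload = build_summary_request(
    "rust web frameworks",
    [{"title": "Actix", "snippet": "A powerful, pragmatic web framework."}],
    model="openrouter/auto",  # example model name
)
print(json.dumps(payload, indent=2))
```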
AGPL-3.0