Transform weeks of manual competitive research into minutes of AI-powered analysis using autonomous agents and enterprise-grade web scraping.
- 🤖 **Multi-Agent Workflow**: Three specialized AI agents working in harmony
  - 🔍 **Researcher Agent**: Data collection with Bright Data web scraping
  - 📊 **Analyst Agent**: Strategic SWOT analysis and threat assessment
  - 📝 **Writer Agent**: Executive-ready competitive intelligence reports
- 🚀 **Real-Time Streaming**: Live progress updates and tool call monitoring
- ⚡ **Enterprise-Grade**: Built with FastAPI, React, and TypeScript
- 🎯 **Comprehensive Analysis**: Pricing, leadership, market position, and strategy
- 📱 **Beautiful UI**: Vercel-inspired design with responsive layout
- 🔧 **Production Ready**: Docker support and scalable architecture
- Python 3.10+
- Node.js 18+
- Gemini API Key
- Bright Data API Key
```bash
git clone https://github.com/brightdata/competitive-intelligence.git
cd competitive-intelligence

# Install Python dependencies
cd api && pip install -r requirements.txt

# Set environment variables
export GEMINI_API_KEY="your_gemini_api_key"
export BRIGHTDATA_API_KEY="your_brightdata_api_key"

# Start the API server
python app.py
```

The API will be available at http://localhost:8000.
```bash
# Navigate to frontend directory
cd ci-agent-ui

# Install dependencies
npm install

# Start development server
npm run dev
```

The frontend will be available at http://localhost:5173.
- Open the frontend in your browser
- Select a demo scenario (Slack, Notion, Figma) or enter a custom company
- Click "Start Analysis" and watch the AI agents work in real-time
- Get comprehensive competitive intelligence in minutes!
- Architecture
- Installation
- API Documentation
- Frontend Guide
- Configuration
- Docker Deployment
- Examples
- Contributing
- License
```
┌─────────────────┐      ┌─────────────────┐      ┌─────────────────┐
│  🔍 Researcher  │─────▶│   📊 Analyst    │─────▶│   📝 Writer     │
│      Agent      │      │      Agent      │      │      Agent      │
└─────────────────┘      └─────────────────┘      └─────────────────┘
         │                        │                        │
         ▼                        ▼                        ▼
  Data Collection        Strategic Analysis       Report Generation
  - Web scraping         - SWOT analysis          - Executive summary
  - Market research      - Threat assessment      - Recommendations
  - Company intel        - Competitive position   - Action items

         ▲
         │
┌─────────────────┐
│   🚀 FastAPI    │
│     Backend     │
└─────────────────┘
         ▲
         │
┌─────────────────┐
│   ⚛️ React      │
│    Frontend     │
└─────────────────┘
```
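The diagram above is a strictly sequential pipeline: each agent consumes the previous agent's output, and the final dict mirrors the session response shape shown later in the API docs. A minimal sketch of that hand-off (the function names here are illustrative stand-ins, not the actual `ci_agent.py` API):

```python
# Illustrative sketch of the sequential hand-off between the three agents.
# Each function stands in for a real LLM-backed agent.

def researcher_agent(competitor: str) -> str:
    """Collect raw data about the competitor (web scraping, market research)."""
    return f"Research findings for {competitor}"

def analyst_agent(research: str) -> str:
    """Turn raw findings into a SWOT / threat assessment."""
    return f"Strategic analysis based on: {research}"

def writer_agent(analysis: str) -> str:
    """Produce the executive-ready report."""
    return f"Final report: {analysis}"

def run_pipeline(competitor: str) -> dict:
    research = researcher_agent(competitor)
    analysis = analyst_agent(research)
    report = writer_agent(analysis)
    return {
        "competitor": competitor,
        "research_findings": research,
        "strategic_analysis": analysis,
        "final_report": report,
    }
```

Because the stages are pure hand-offs, any single agent can be swapped out or re-prompted without touching the others.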
Backend:
- FastAPI: Modern, fast web framework for building APIs
- Strands Agents: Autonomous AI agent framework
- Bright Data: Enterprise web scraping and data collection
- Google Gemini 2.0: Advanced language model for analysis
- LiteLLM: Unified interface for multiple AI models
Frontend:
- React 18: Modern UI library with hooks
- TypeScript: Type-safe JavaScript development
- Vite: Fast build tool and development server
- Tailwind CSS: Utility-first CSS framework
- shadcn/ui: Beautiful, accessible component library
- Clone and set up the backend:

```bash
git clone https://github.com/brightdata/competitive-intelligence.git
cd competitive-intelligence

# Create virtual environment
python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate

# Install dependencies
pip install -r requirements-api.txt
```

- Set up the frontend:

```bash
cd ci-agent-ui
npm install
```

- Environment configuration:

```bash
# Copy example environment file
cp .env.example .env
```

```
# Edit .env with your API keys
GEMINI_API_KEY=your_gemini_api_key_here
BRIGHTDATA_API_KEY=your_brightdata_api_key_here
```

- Start development servers:

```bash
# Terminal 1: Backend
python app.py

# Terminal 2: Frontend
cd ci-agent-ui && npm run dev
```

See our Docker Deployment section for production deployment instructions.
The API server runs at http://localhost:8000.

`GET /health` — health check

`POST /analyze/stream` — start a streaming analysis:

```
Content-Type: application/json

{
  "competitor_name": "Slack",
  "competitor_website": "https://slack.com",
  "stream": true
}
```

`GET /demo-scenarios` — list pre-configured demo scenarios

`GET /sessions` — list analysis sessions

`GET /sessions/{session_id}` — retrieve a session result:

```json
{
  "competitor": "Slack",
  "website": "https://slack.com",
  "research_findings": "Comprehensive research data...",
  "strategic_analysis": "SWOT and competitive analysis...",
  "final_report": "Executive summary and recommendations...",
  "timestamp": "2025-09-17T10:30:00Z",
  "status": "success"
}
```

- Swagger UI: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
- OpenAPI Schema: http://localhost:8000/openapi.json
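The streaming endpoint emits Server-Sent-Events-style lines: each event is prefixed with `data: ` and carries a JSON payload. A small helper for decoding one such line (a sketch — the exact event fields depend on the backend, and `agent_start` here is just an assumed event type):

```python
import json
from typing import Optional

def parse_sse_line(line: str) -> Optional[dict]:
    """Decode one 'data: {...}' line from the /analyze/stream response.

    Returns the parsed event dict, or None for non-data lines
    (comments, blank keep-alives, etc.).
    """
    if not line.startswith("data: "):
        return None
    return json.loads(line[len("data: "):])

# Example: decode a hypothetical agent-start event
event = parse_sse_line('data: {"type": "agent_start", "agent": "researcher"}')
```

The same helper works for both the Python and JavaScript clients shown in the Examples section, since both receive the raw `data: ` lines.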
- `CompetitiveIntelligenceForm`: Main analysis interface with real-time streaming
- `DemoScenarios`: Pre-configured company examples
- `Header`: Navigation and branding
- `ProgressTracker`: Live agent workflow visualization
```bash
npm run dev      # Start development server
npm run build    # Build for production
npm run preview  # Preview production build
npm run lint     # Run ESLint
```

- Styling: Modify `tailwind.config.js` for theme customization
- Components: Add new shadcn/ui components with `npx shadcn@latest add [component]`
- API Endpoint: Update `API_BASE_URL` in component files
| Variable | Description | Required | Default |
|---|---|---|---|
| `GEMINI_API_KEY` | Google AI Studio API key | Yes | - |
| `BRIGHTDATA_API_KEY` | Bright Data API key | Yes | - |
| `GEMINI_MODEL_NAME` | Gemini model version | No | `gemini-2.0-flash` |
| `API_HOST` | API server host | No | `0.0.0.0` |
| `API_PORT` | API server port | No | `8000` |
| `LOG_LEVEL` | Logging level | No | `info` |
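The optional variables above fall back to the listed defaults when unset. A sketch of how the backend might resolve them (variable names and defaults are from the table; the `load_config` helper itself is illustrative, not the actual `app.py` code):

```python
import os

def load_config(env=None):
    """Resolve settings from environment variables, applying documented defaults."""
    env = os.environ if env is None else env
    return {
        "gemini_api_key": env["GEMINI_API_KEY"],          # required, no default
        "brightdata_api_key": env["BRIGHTDATA_API_KEY"],  # required, no default
        "model_name": env.get("GEMINI_MODEL_NAME", "gemini-2.0-flash"),
        "host": env.get("API_HOST", "0.0.0.0"),
        "port": int(env.get("API_PORT", "8000")),
        "log_level": env.get("LOG_LEVEL", "info"),
    }

# With only the two required keys set, everything else takes its default
cfg = load_config({"GEMINI_API_KEY": "g-key", "BRIGHTDATA_API_KEY": "bd-key"})
```

Note that a missing required key raises `KeyError` immediately, which surfaces misconfiguration at startup rather than mid-analysis.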
Each agent can be customized by modifying its system prompt in `api/ci_agent.py`:
- Researcher Agent: Data collection and web scraping behavior
- Analyst Agent: Analysis depth and strategic focus
- Writer Agent: Report structure and formatting
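For example, the prompts might live as plain strings near the top of `api/ci_agent.py`, so tuning an agent's focus is a one-line change (the structure below is a hypothetical sketch, not the file's actual layout):

```python
# Hypothetical layout: each agent's behavior is steered by its system prompt.
RESEARCHER_PROMPT = (
    "You are a competitive intelligence researcher. Collect pricing, "
    "leadership, and market-position data about the target company."
)

ANALYST_PROMPT = (
    "You are a strategy analyst. Produce a SWOT analysis and a threat "
    "assessment from the researcher's findings."
)

WRITER_PROMPT = (
    "You are an executive report writer. Summarize the analysis into "
    "clear recommendations and action items."
)

# To sharpen one agent without touching the others, extend its prompt:
ANALYST_PROMPT += " Pay special attention to pricing-model vulnerabilities."
```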
```bash
# Start all services
docker-compose up -d

# View logs
docker-compose logs -f

# Stop services
docker-compose down
```

Backend:

```bash
cd api
docker build -t ci-backend .
docker run -p 8000:8000 \
  -e GEMINI_API_KEY=your_key \
  -e BRIGHTDATA_API_KEY=your_key \
  ci-backend
```

Frontend:

```bash
cd ci-agent-ui
docker build -t ci-frontend .
docker run -p 3000:80 ci-frontend
```

- Use environment-specific configuration files
- Implement proper logging and monitoring
- Set up SSL/TLS certificates
- Configure rate limiting and security headers
- Use a reverse proxy (nginx/Cloudflare)
```python
import json

import requests

# Start streaming analysis
response = requests.post(
    "http://localhost:8000/analyze/stream",
    json={
        "competitor_name": "Slack",
        "competitor_website": "https://slack.com",
        "stream": True
    },
    stream=True
)

for line in response.iter_lines(decode_unicode=True):
    if line.startswith("data: "):
        event = json.loads(line[6:])
        print(f"Event: {event['type']}")
        if event['type'] == 'complete':
            print("Analysis completed!")
            break
```

```javascript
const response = await fetch('/analyze/stream', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    competitor_name: 'Slack',
    stream: true
  })
});

const reader = response.body.getReader();
const decoder = new TextDecoder();

while (true) {
  const { value, done } = await reader.read();
  if (done) break;
  const chunk = decoder.decode(value);
  // Process streaming events
}
```

```bash
# Run interactive demo
python api/ci_agent.py

# Analyze specific competitor
python -c "
from api.ci_agent import MultiAgentCompetitiveIntelligence
ci = MultiAgentCompetitiveIntelligence()
result = ci.run_competitive_intelligence_workflow('Slack')
print(result['final_report'])
"
```

```bash
cd api
pip install pytest httpx
pytest tests/
```

```bash
cd ci-agent-ui
npm test
npm run test:coverage
```

```bash
# Start services
docker-compose up -d

# Run end-to-end tests
npm run test:e2e
```

We welcome contributions! Please see our Contributing Guide for details.
- Fork the repository
- Create a feature branch: `git checkout -b feature/amazing-feature`
- Make your changes and add tests
- Commit with conventional commits: `git commit -m "feat: add amazing feature"`
- Push to your fork: `git push origin feature/amazing-feature`
- Open a Pull Request
- Python: Follow PEP 8, use Black formatter
- TypeScript: Follow ESLint configuration
- Commits: Use Conventional Commits
See CHANGELOG.md for a detailed history of changes.
- Bug Reports: GitHub Issues
- Feature Requests: GitHub Discussions
- Documentation: Wiki
This project is licensed under the MIT License - see the LICENSE file for details.
- Strands: Autonomous AI agent framework
- Bright Data: Enterprise web scraping platform
- Google Gemini: Advanced language model
- FastAPI: Modern Python web framework
- shadcn/ui: Beautiful React components
- Blog Post: Technical Deep Dive
Built with ❤️ by Bright Data

⭐ Star this repo if you find it useful!