A real-time security scanner for Solana tokens that monitors new Raydium liquidity pools and filters out rug pulls, honeypots, and scam tokens to find legitimate investment opportunities.
- Pool Monitoring - Watches Raydium DEX for new liquidity pools every 60 seconds (see the polling sketch after this list)
- Security Analysis - Checks for mint/freeze authorities (rug pull indicators)
- Liquidity Filtering - Only shows tokens with substantial liquidity ($5k+)
- Pump.fun Detection - Filters out pump.fun tokens
- Real-time Dashboard - Web interface to view safe tokens
- Transaction Monitoring - Verifies recent trading activity
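The monitoring loop is easy to sketch. The snippet below polls a Raydium pairs endpoint and reports pools it has not seen in earlier scans; the endpoint URL and the ammId field name are assumptions based on Raydium's public v2 API, and the real logic lives in main.py.

import time
import requests

# Assumed public endpoint; main.py defines the actual RAYDIUM_API_ENDPOINT.
RAYDIUM_API_ENDPOINT = "https://api.raydium.io/v2/main/pairs"
CHECK_INTERVAL = 60  # seconds, the scanner's default

seen_pools = set()

def poll_new_pools():
    """Fetch all pairs and return those not seen in earlier scans."""
    pairs = requests.get(RAYDIUM_API_ENDPOINT, timeout=30).json()
    fresh = [p for p in pairs if p.get("ammId") not in seen_pools]
    seen_pools.update(p.get("ammId") for p in fresh)
    return fresh

while True:
    for pool in poll_new_pools():
        print("New pool:", pool.get("name"), pool.get("ammId"))
    time.sleep(CHECK_INTERVAL)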
- Python 3.7+ installed
- pip package manager
- 2GB+ free disk space (for database)
- Internet connection for API access
# Clone the repository or download the project
git clone <repository-url>
cd pythonProject
# Create virtual environment (recommended)
python3 -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
# Install Python dependencies
pip install -r requirements.txt
For Telegram notifications, create a .env file:
TG_API_KEY=your_telegram_bot_token
TG_CHAT_ID=your_chat_id
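These two variables are all the Telegram Bot API needs. A minimal notifier sketch, assuming the project loads the .env file with python-dotenv (the actual notification code lives in main.py):

import os
import requests
from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv()  # reads TG_API_KEY and TG_CHAT_ID from .env

def send_telegram(text):
    """Send a plain-text message via the Telegram Bot API."""
    token = os.environ["TG_API_KEY"]
    chat_id = os.environ["TG_CHAT_ID"]
    requests.post(
        f"https://api.telegram.org/bot{token}/sendMessage",
        data={"chat_id": chat_id, "text": text},
        timeout=10,
    )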
# Make scripts executable
chmod +x start_optimized.sh stop_services.sh
# Start everything with one command
./start_optimized.sh
# Access the dashboard at: http://localhost:8080
# Advanced dashboard at: http://localhost:8084
# Stop all services
./stop_services.sh
# Terminal 1 - Start the scanner
python3 main.py
# Or use the optimized version:
python3 optimized_scanner.py
# Terminal 2 - Start the dashboard
python3 advanced_filter_dashboard.py
# Dashboard will be available at: http://localhost:8084
# Run scanner in background
nohup python3 main.py > scanner.log 2>&1 &
# Run dashboard in background
nohup python3 advanced_filter_dashboard.py > dashboard.log 2>&1 &
# Check processes
ps aux | grep python3
# View logs
tail -f scanner.log
tail -f dashboard.log
Open your browser to:
- Basic Dashboard: http://localhost:8080
- Enhanced Dashboard: http://localhost:8082
- Advanced Filter Dashboard (Main): http://localhost:8084
The advanced dashboard provides:
- Real-time Filtering - DexTools/DexScreener-style filtering
- Market Cap Sorting - Estimated market caps
- Preset Configurations - 5 competitive filter presets
- Table View - Professional data presentation
- Shows tokens discovered in last 2 hours
- Minimum $500 liquidity requirement
- Direct links to Solscan and DexScreener
- Pump.fun token indicators
- High Liquidity: $10,000+ minimum
- Trading Volume: $500+ daily volume
- No Pump Tokens: Filters out pump.fun launches
- Safety Score: 1-10 rating system (sketched after this list)
- Fresh Data: Last 6 hours only
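How these criteria could combine into the 1-10 safety score is sketched below; the field names and weights here are hypothetical, so check the dashboard source for the real formula.

def safety_score(token):
    """Hypothetical 1-10 rating combining the checks listed above."""
    score = 1
    if token.get("mint_authority") is None:
        score += 3  # no infinite-supply risk
    if token.get("freeze_authority") is None:
        score += 2  # holders cannot be frozen
    if token.get("liquidity", 0) >= 10_000:
        score += 2  # substantial liquidity
    if token.get("volume24h", 0) >= 500:
        score += 1  # active trading
    if not token.get("is_pump_token", False):
        score += 1  # not a pump.fun launch
    return min(score, 10)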
The scanner applies multiple security checks (an authority-check sketch follows the lists below):
- ✅ No mint authority (can't create infinite supply)
- ✅ No freeze authority (can't freeze user funds)
- ✅ Substantial liquidity ($10k+)
- ✅ Active trading volume ($500+)
- ✅ Not a pump.fun token
- ✅ Recent trading activity
A token is rejected when any of these red flags appears:
- Mint authority present
- Freeze authority present
- Low liquidity pools
- Pump.fun launches
- No recent transactions
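The mint/freeze checks map directly onto Solana's getAccountInfo RPC call with jsonParsed encoding, which returns mintAuthority and freezeAuthority for SPL token mints. A minimal sketch (the scanner's actual implementation may differ):

import requests

SOLANA_RPC_ENDPOINT = "https://api.mainnet-beta.solana.com"  # public RPC

def check_authorities(mint_address):
    """Return the mint and freeze authorities of an SPL mint (None = revoked)."""
    resp = requests.post(
        SOLANA_RPC_ENDPOINT,
        json={
            "jsonrpc": "2.0",
            "id": 1,
            "method": "getAccountInfo",
            "params": [mint_address, {"encoding": "jsonParsed"}],
        },
        timeout=10,
    ).json()
    info = resp["result"]["value"]["data"]["parsed"]["info"]
    return {
        "mint_authority": info.get("mintAuthority"),      # None is the safe case
        "freeze_authority": info.get("freezeAuthority"),  # None is the safe case
    }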
pythonProject/
├── main.py              # Core scanner logic
├── dashboard.py         # Web dashboard
├── alterdb.py           # Database migrations
├── requirements.txt     # Python dependencies
├── .env                 # Telegram config (optional)
├── templates/
│   └── dashboard.html   # Dashboard UI
└── raydium_pools.db     # SQLite database
CHECK_INTERVAL = 60 # Scan every 60 seconds
RAYDIUM_API_ENDPOINT # Raydium pools API
SOLANA_RPC_ENDPOINT # Solana RPC for token analysis
SOLSCAN_API_ENDPOINT # Transaction verification
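SOLSCAN_API_ENDPOINT backs the recent-activity check. A rough sketch, assuming Solscan's public account/transactions endpoint (verify the path and parameters against the endpoint configured in main.py):

import requests

SOLSCAN_API_ENDPOINT = "https://public-api.solscan.io"  # assumed base URL

def has_recent_activity(address, min_txs=1):
    """True if the address shows at least min_txs recent transactions."""
    resp = requests.get(
        f"{SOLSCAN_API_ENDPOINT}/account/transactions",
        params={"account": address, "limit": 10},
        timeout=10,
    )
    return resp.ok and len(resp.json()) >= min_txs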
Dashboard query windows:
# Recent tokens (last 2 hours, $500+ liquidity)
# Safe tokens (last 6 hours, $10k+ liquidity, $500+ volume)
The database grows large over time. To optimize it:
sqlite3 raydium_pools.db "VACUUM;"
sqlite3 raydium_pools.db "REINDEX;"- Raydium API: No strict limits observed
- Solscan API: 120 requests/minute (see the throttle sketch after this list)
- Solana RPC: Varies by provider
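Staying under the Solscan limit is easiest with a client-side throttle; a minimal sketch:

import time

class RateLimiter:
    """Client-side throttle: at most max_calls requests per period seconds."""
    def __init__(self, max_calls=120, period=60.0):
        self.min_interval = period / max_calls
        self.last_call = 0.0

    def wait(self):
        now = time.monotonic()
        sleep_for = self.last_call + self.min_interval - now
        if sleep_for > 0:
            time.sleep(sleep_for)
        self.last_call = time.monotonic()

solscan_limiter = RateLimiter(max_calls=120, period=60.0)
# Call solscan_limiter.wait() before every Solscan request.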
# Different scan intervals
CHECK_INTERVAL=30 python3 main.py # Scan every 30 seconds
# Different database
DATABASE_FILE=fast_scan.db python3 main.py
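These overrides work if the scanner reads its settings from the environment with defaults, along these lines (a sketch; main.py may wire configuration differently):

import os

# Environment variables override the built-in defaults.
CHECK_INTERVAL = int(os.getenv("CHECK_INTERVAL", "60"))
DATABASE_FILE = os.getenv("DATABASE_FILE", "raydium_pools.db")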
Edit dashboard.py to adjust filtering criteria:
# More conservative (higher liquidity)
AND liquidity > 50000
AND volume24h > 2000
# More aggressive (lower requirements)
AND liquidity > 1000
AND volume24h > 100
Example scanner log output:
2025-09-20 06:11:58 - INFO - Current pools fetched: 701535
2025-09-20 06:12:03 - INFO - Found 5 new untradable pools
2025-09-20 06:12:05 - INFO - Safe token found: TokenName (address)
-- Check recent discoveries
SELECT COUNT(*) FROM pools WHERE discovered_at > datetime('now', '-1 hour');
-- View safe tokens
SELECT name, liquidity, volume24h FROM pools
WHERE liquidity > 10000 AND is_pump_token = 0
ORDER BY discovered_at DESC LIMIT 10;
Safety reminders:
- This tool provides analysis, not investment advice
- Always DYOR (Do Your Own Research)
- Check contract source code when available
- Verify team/project legitimacy
- Start with small amounts
Limitations:
- Requires internet connection for real-time data
- Raydium API availability affects discovery speed
- Solscan API needed for transaction verification
Database notes:
- Database grows ~1MB per day
- Contains 700k+ historical pool records
- Consider periodic cleanup for old records
"Database is locked"
# Kill all scanner processes
pkill -f "python3 main.py"
# Restart scanner
python3 main.py"Port already in use"
"Port already in use"
# Change dashboard port
# Edit dashboard.py: app.run(port=8081)
"No recent tokens found"
- Wait 5-10 minutes for new pool discoveries
- Check scanner logs for API errors
- Verify internet connection
# Check system resources
top -p $(pgrep -f "python3 main.py")
# Database optimization
sqlite3 raydium_pools.db "ANALYZE;"Requirements:
- 2 vCPUs, 4GB RAM minimum
- Ubuntu 20.04+ or similar
- 20GB+ storage for database growth
Setup on VPS:
# Update system
sudo apt update && sudo apt upgrade -y
# Install Python and dependencies
sudo apt install python3-pip python3-venv nginx supervisor -y
# Clone and setup
git clone <repository-url> /opt/solana-scanner
cd /opt/solana-scanner/pythonProject
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
# Configure Supervisor for auto-restart
sudo nano /etc/supervisor/conf.d/scanner.conf
Supervisor Config Example:
[program:scanner]
command=/opt/solana-scanner/pythonProject/venv/bin/python main.py
directory=/opt/solana-scanner/pythonProject
autostart=true
autorestart=true
user=ubuntu
[program:dashboard]
command=/opt/solana-scanner/pythonProject/venv/bin/python advanced_filter_dashboard.py
directory=/opt/solana-scanner/pythonProject
autostart=true
autorestart=true
user=ubuntu
Dockerfile:
FROM python:3.9-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "main.py"]docker-compose.yml:
version: '3.8'
services:
  scanner:
    build: .
    command: python main.py
    volumes:
      - ./raydium_pools.db:/app/raydium_pools.db
    restart: unless-stopped
  dashboard:
    build: .
    command: python advanced_filter_dashboard.py
    ports:
      - "8084:8084"
    volumes:
      - ./raydium_pools.db:/app/raydium_pools.db
    restart: unless-stopped
Heroku:
# Create Procfile
echo "web: python advanced_filter_dashboard.py" > Procfile
echo "worker: python main.py" >> Procfile
# Deploy
heroku create your-scanner-app
git push heroku main
heroku ps:scale web=1 worker=1
AWS EC2 / Google Cloud / DigitalOcean:
- Use t2.medium / e2-medium / 4GB droplet minimum
- Setup with systemd services for auto-restart
- Configure security groups for port 8084
# Use environment variables for sensitive data
export SOLANA_RPC_ENDPOINT="your-private-rpc"
export TG_API_KEY="your-telegram-key"
# Restrict dashboard access with nginx
server {
    listen 80;
    location / {
        proxy_pass http://localhost:8084;
        auth_basic "Restricted Access";
        auth_basic_user_file /etc/nginx/.htpasswd;
    }
}
# Database maintenance cron job
0 2 * * * sqlite3 /path/to/raydium_pools.db "VACUUM; ANALYZE;"
# Log rotation
/var/log/scanner/*.log {
    daily
    rotate 7
    compress
    missingok
}
# Health check endpoint
curl http://localhost:8084/api/stats
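The exact response shape depends on advanced_filter_dashboard.py; a hypothetical version of such an endpoint might look like this:

import sqlite3
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/api/stats")
def stats():
    # Hypothetical payload; check advanced_filter_dashboard.py for the real one.
    conn = sqlite3.connect("raydium_pools.db")
    (total,) = conn.execute("SELECT COUNT(*) FROM pools").fetchone()
    (recent,) = conn.execute(
        "SELECT COUNT(*) FROM pools WHERE discovered_at > datetime('now', '-1 hour')"
    ).fetchone()
    conn.close()
    return jsonify({"total_pools": total, "pools_last_hour": recent})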
# Process monitoring
pip install prometheus-flask-exporter
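Wiring the exporter into the dashboard takes two lines; it then serves request counts and latencies at /metrics:

from flask import Flask
from prometheus_flask_exporter import PrometheusMetrics

app = Flask(__name__)
metrics = PrometheusMetrics(app)  # exposes default request metrics at /metrics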
# Telegram alerts for crashes
./start_with_monitoring.sh

┌─────────────────┐
│   Nginx/Caddy   │ ← HTTPS, Rate limiting
└────────┬────────┘
         │
┌────────▼────────┐
│ Flask Dashboard │ ← Port 8084
└────────┬────────┘
         │
┌────────▼────────┐
│ SQLite Database │ ← Consider PostgreSQL for scale
└────────┬────────┘
         │
┌────────▼────────┐
│ Scanner Process │ ← Multiple workers possible
└─────────────────┘
- Database: Migrate to PostgreSQL for better concurrency
- Scanner: Run multiple scanner instances with different intervals
- Dashboard: Add Redis caching for frequent queries (a minimal sketch follows this list)
- API: Implement rate limiting and API keys
- Storage: Archive old records to S3/GCS
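For the dashboard item above, a minimal read-through cache with redis-py could look like this (the key name and TTL are illustrative):

import json
import sqlite3

import redis  # pip install redis

cache = redis.Redis(host="localhost", port=6379, db=0)

def get_safe_tokens_cached(ttl=30):
    """Serve the dashboard's hottest query from Redis, refreshing every ttl seconds."""
    cached = cache.get("safe_tokens")
    if cached is not None:
        return json.loads(cached)
    conn = sqlite3.connect("raydium_pools.db")
    rows = conn.execute(
        "SELECT name, liquidity, volume24h FROM pools "
        "WHERE liquidity > 10000 AND is_pump_token = 0 "
        "ORDER BY discovered_at DESC LIMIT 50"
    ).fetchall()
    conn.close()
    cache.setex("safe_tokens", ttl, json.dumps(rows))
    return rows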
For issues or improvements:
- Check the troubleshooting section
- Review scanner logs for errors
- Verify API endpoints are accessible
- Consider adjusting scan intervals
A well-configured scanner should:
- Discover 5-20 new tokens per hour
- Find 1-5 "safe" tokens per day
- Maintain <5% false positives
- Complete scans in under 30 seconds
Happy safe token hunting!