Automated daily job search system that finds finance roles at startups, generates personalized cover letters, and sends push notifications.
- Daily job scraping from LinkedIn, Wellfound (AngelList), and Adzuna
- AI-powered matching using Claude to score job fit (1-10)
- Auto-generated cover letters personalized to each role
- Hiring manager research with LinkedIn search links
- Google Drive storage - organized folders per company
- Push notifications via Pushover when matches are found
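The Claude fit-scoring step in the feature list can be sketched roughly as below. The prompt wording, the model name, and the `parse_fit_score` / `build_scoring_prompt` helpers are illustrative assumptions, not this repo's actual code:

```python
import re

def parse_fit_score(text: str) -> int:
    """Pull the first integer out of Claude's reply and clamp it to 1-10.
    (Hypothetical helper; the real parsing lives in src/processors/.)"""
    match = re.search(r"\d+", text)
    if not match:
        return 1  # treat unparseable replies as the lowest score
    return max(1, min(10, int(match.group())))

def build_scoring_prompt(job_title: str, job_description: str) -> str:
    # Hypothetical prompt; the real one is in the project's processor code.
    return (
        "Rate how well this job fits a finance candidate targeting startups, "
        "on a scale of 1-10. Reply with just the number.\n\n"
        f"Title: {job_title}\n\nDescription: {job_description}"
    )

if __name__ == "__main__":
    # Requires `pip install anthropic` and ANTHROPIC_API_KEY in the environment.
    import anthropic
    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY automatically
    reply = client.messages.create(
        model="claude-sonnet-4-20250514",  # assumed model name; any current model works
        max_tokens=10,
        messages=[{"role": "user", "content": build_scoring_prompt(
            "Senior Finance Manager", "Early-stage fintech startup...")}],
    )
    print(parse_fit_score(reply.content[0].text))
```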
```bash
cd ~/Desktop/job-search-bot

# Create virtual environment
python3 -m venv venv
source venv/bin/activate

# Install dependencies
pip install -r requirements.txt

# Install Playwright browser
playwright install chromium

# Copy the example environment file
cp .env.example .env
```

Edit `.env` with your API keys (see Setup Guide below).
```bash
python -m src.main
```

- Go to console.anthropic.com
- Sign up / Log in
- Go to API Keys → Create Key
- Copy the key to `.env` as `ANTHROPIC_API_KEY`
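Before running the full pipeline, a quick sanity check on the key can catch copy-paste mistakes. The `sk-ant-` prefix check below is an observed convention of current Anthropic keys, not a guaranteed contract:

```python
import os

def looks_like_anthropic_key(key: str) -> bool:
    """Cheap format check before making a real API request.
    Anthropic keys currently begin with 'sk-ant-' (observed, not guaranteed)."""
    return key.startswith("sk-ant-") and len(key) > 20

if __name__ == "__main__":
    key = os.environ.get("ANTHROPIC_API_KEY", "")
    if not looks_like_anthropic_key(key):
        raise SystemExit("ANTHROPIC_API_KEY missing or malformed in .env")
    print("Key looks plausible; run the bot to confirm it authenticates.")
```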
- Go to Google Cloud Console
- Create a new project (e.g., "Job Search Bot")
- Enable the Google Drive API:
- APIs & Services → Library → Search "Drive" → Enable
- Create a Service Account:
- APIs & Services → Credentials → Create Credentials → Service Account
- Name: "job-search-bot"
- Grant "Editor" role
- Create key for service account:
- Click the service account → Keys → Add Key → JSON
- Download and save as `credentials.json` in the project folder
- Create a Google Drive folder for outputs
- Share the folder with the service account email (ends in `@...iam.gserviceaccount.com`)
- Copy the folder ID from the URL: `drive.google.com/drive/folders/FOLDER_ID_HERE`
- Add to `.env`:

```
GOOGLE_CREDENTIALS_PATH=./credentials.json
GOOGLE_DRIVE_FOLDER_ID=your-folder-id
```
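The Drive setup above can be smoke-tested with a short upload script. This is a sketch assuming `pip install google-api-python-client google-auth` and the `.env` values described here; `drive_file_metadata` is a hypothetical helper:

```python
import os

def drive_file_metadata(name: str, folder_id: str) -> dict:
    """Metadata dict that places an uploaded file inside the shared folder.
    (Hypothetical helper for illustration.)"""
    return {"name": name, "parents": [folder_id]}

if __name__ == "__main__":
    # Performs a real upload; needs google-api-python-client and google-auth.
    from google.oauth2 import service_account
    from googleapiclient.discovery import build
    from googleapiclient.http import MediaFileUpload

    creds = service_account.Credentials.from_service_account_file(
        os.environ.get("GOOGLE_CREDENTIALS_PATH", "./credentials.json"),
        scopes=["https://www.googleapis.com/auth/drive"],
    )
    drive = build("drive", "v3", credentials=creds)
    media = MediaFileUpload("test.txt", mimetype="text/plain")
    created = drive.files().create(
        body=drive_file_metadata("test.txt", os.environ["GOOGLE_DRIVE_FOLDER_ID"]),
        media_body=media,
        fields="id",
    ).execute()
    print("Uploaded file id:", created["id"])
```

If the upload fails with a 404 or permission error, the folder was most likely not shared with the service account email.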
- Download Pushover app on your phone (iOS / Android)
- Create account at pushover.net
- Copy your User Key from the dashboard
- Create an Application:
- Create New Application
- Name: "Job Search Bot"
- Copy the API Token
- Add to `.env`:

```
PUSHOVER_USER_KEY=your-user-key
PUSHOVER_API_TOKEN=your-api-token
```
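Sending a Pushover notification is a single POST with `token`, `user`, and `message` form fields. A minimal sketch, assuming `pip install requests` (`pushover_payload` is an illustrative helper, not project code):

```python
import os

PUSHOVER_URL = "https://api.pushover.net/1/messages.json"

def pushover_payload(user_key: str, api_token: str,
                     message: str, title: str = "") -> dict:
    """Form fields Pushover expects; token, user, and message are required."""
    payload = {"token": api_token, "user": user_key, "message": message}
    if title:
        payload["title"] = title  # optional notification title
    return payload

if __name__ == "__main__":
    import requests  # pip install requests
    resp = requests.post(PUSHOVER_URL, data=pushover_payload(
        os.environ["PUSHOVER_USER_KEY"],
        os.environ["PUSHOVER_API_TOKEN"],
        "Job Search Bot test notification",
    ), timeout=10)
    resp.raise_for_status()
    print(resp.json())  # {"status": 1, ...} on success
```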
- Sign up at developer.adzuna.com
- Create an application
- Add to `.env`:

```
ADZUNA_APP_ID=your-app-id
ADZUNA_APP_KEY=your-app-key
```
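Adzuna's search endpoint takes the app ID and key as query parameters. A sketch of building and calling a search URL, based on Adzuna's documented `jobs/{country}/search/{page}` endpoint (country code and query terms below are illustrative):

```python
from urllib.parse import urlencode

def adzuna_search_url(app_id: str, app_key: str, what: str,
                      country: str = "us", page: int = 1) -> str:
    """Build a URL for Adzuna's job search endpoint."""
    base = f"https://api.adzuna.com/v1/api/jobs/{country}/search/{page}"
    query = urlencode({"app_id": app_id, "app_key": app_key,
                       "what": what, "content-type": "application/json"})
    return f"{base}?{query}"

if __name__ == "__main__":
    import json
    import os
    from urllib.request import urlopen
    url = adzuna_search_url(os.environ["ADZUNA_APP_ID"],
                            os.environ["ADZUNA_APP_KEY"],
                            "finance startup")
    with urlopen(url, timeout=10) as resp:
        data = json.load(resp)
    print(data["count"], "matching jobs")  # listings are in data["results"]
```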
```bash
cd ~/Desktop/job-search-bot
git init
git add .
git commit -m "Initial commit"
gh repo create job-search-bot --private --push
```

- Go to railway.app and sign in with GitHub
- Click "New Project" → "Deploy from GitHub repo"
- Select your `job-search-bot` repository
- Railway will auto-detect the config
In Railway dashboard → Variables, add all your .env values:
- `ANTHROPIC_API_KEY`
- `GOOGLE_DRIVE_FOLDER_ID`
- `PUSHOVER_USER_KEY`
- `PUSHOVER_API_TOKEN`
- `ADZUNA_APP_ID` (optional)
- `ADZUNA_APP_KEY` (optional)
For Google credentials, create a variable `GOOGLE_CREDENTIALS_JSON` containing the full contents of `credentials.json`, since Railway has no local file to point a path at.
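On Railway the service-account JSON lives in an environment variable rather than a file, so it has to be parsed at runtime. A sketch, assuming `pip install google-auth` (`load_service_account_info` is an illustrative helper):

```python
import json
import os

def load_service_account_info(env_var: str = "GOOGLE_CREDENTIALS_JSON") -> dict:
    """Parse the service-account JSON stored directly in an env variable."""
    raw = os.environ.get(env_var)
    if not raw:
        raise RuntimeError(f"{env_var} is not set")
    info = json.loads(raw)
    if info.get("type") != "service_account":
        raise ValueError("JSON does not look like a service-account key")
    return info

if __name__ == "__main__":
    # pip install google-auth
    from google.oauth2 import service_account
    creds = service_account.Credentials.from_service_account_info(
        load_service_account_info(),
        scopes=["https://www.googleapis.com/auth/drive"],
    )
    print("Loaded credentials for", creds.service_account_email)
```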
The included `railway.toml` is already configured to run daily at 8 AM EST.
To change the schedule, edit `railway.toml`:
```toml
[deploy.cron]
schedule = "0 13 * * *"  # 13:00 UTC = 8 AM EST
```

| Service | Cost |
|---|---|
| Railway | ~$5/month |
| Claude API | ~$10-20/month |
| Google Drive | Free |
| Pushover | $5 one-time |
| Total | ~$15-25/month |
Edit `src/config.py` to change job titles, locations, salary requirements, etc.
Edit `src/config.py` to update the candidate profile used for cover letter generation.
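The actual layout of `src/config.py` is not shown here; purely as an illustration of the kind of settings described above (every field name and value below is hypothetical), the search criteria and candidate profile might be grouped like this:

```python
from dataclasses import dataclass, field

@dataclass
class SearchConfig:
    # Hypothetical fields -- check src/config.py for the real names.
    job_titles: list[str] = field(default_factory=lambda: [
        "Finance Manager", "Head of Finance", "FP&A Lead"])
    locations: list[str] = field(default_factory=lambda: ["Remote", "New York"])
    min_salary: int = 120_000
    min_fit_score: int = 7  # only jobs scoring >= this trigger a notification

@dataclass
class CandidateProfile:
    # Hypothetical profile used when generating cover letters.
    name: str = "Your Name"
    summary: str = "Finance professional targeting early-stage startups."
    resume_path: str = "data/resume.pdf"
```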
- LinkedIn has anti-scraping measures
- Try running with `headless=False` to debug
- Verify the service account email has access to the folder
- Check that the `credentials.json` path is correct
- Ensure the Drive API is enabled in Google Cloud
- Verify Pushover app is installed and logged in
- Test with:

```bash
curl -s -F "token=YOUR_TOKEN" -F "user=YOUR_KEY" -F "message=test" https://api.pushover.net/1/messages.json
```
```
job-search-bot/
├── src/
│   ├── main.py          # Main orchestrator
│   ├── config.py        # Configuration
│   ├── scrapers/        # Job source scrapers
│   ├── processors/      # AI processing
│   ├── services/        # External services
│   └── database/        # Job tracking
├── data/
│   └── resume.pdf       # Resume
├── requirements.txt
├── .env.example
└── railway.toml
```
Private - for personal use only.