# Used EV Price Tracker

Track used electric vehicle prices across multiple sources with automated scraping and visualization.

## Features

- **Automated Daily Scraping**: GitHub Actions runs scrapers daily at midnight UTC
- **Multiple Sources**: CarMax, Carvana, Platt Auto, and extensible to more
- **11 EV Models Tracked**: Tesla Model 3/Y/S/X, Nissan Leaf/Ariya, Chevy Bolt EV/EUV, Ford Mustang Mach-E, Hyundai Ioniq 5, Volkswagen ID.4
- **Interactive Visualizations**:
  - Overview chart showing average prices across all models
  - Detail charts with price ranges and individual listings
  - Clickable labels and direct line labels
- **Static Deployment**: Hosted on GitHub Pages with no backend required

## Live Demo

Visit the live tracker at: https://quicklywilliam.github.io/usedevpricetracker/

## Architecture

- **Data Storage**: JSON files in `/data/{source}/{date}.json` committed to the repository
- **Scraping**: Puppeteer-based scrapers running in GitHub Actions
- **Frontend**: React app with Chart.js visualizations
- **Deployment**: GitHub Pages for static hosting
- **Workflows**:
  - `deploy.yml`: runs on every push to main; builds and deploys the frontend only
  - `scrape-and-deploy.yml`: runs daily at midnight UTC; scrapes all sources, commits data, then deploys

## Quick Start

### Run Frontend Locally

```bash
npm install
npm run dev
```

Visit http://localhost:5173

### Run Scrapers Locally

Run all scrapers:
```bash
node scrapers/run-all.js
```

Run a specific scraper:
```bash
node scrapers/carmax/scrape.js
node scrapers/carvana/scrape.js
node scrapers/plattauto/scrape.js
```

Run the mock scraper for testing:
```bash
node scrapers/mock-source/scrape.js
```

### Generate Mock Historical Data

To test visualizations with multiple days of data:
```bash
node scrapers/generate-mock-history.js
```

## Project Structure

```
.
├── .github/workflows/
│   ├── deploy.yml              # Deploy-only workflow (on push)
│   └── scrape-and-deploy.yml   # Daily scraping workflow
├── data/                       # Scraped price data (JSON)
│   ├── carmax/
│   ├── carvana/
│   ├── plattauto/
│   └── mock-source/
├── scrapers/
│   ├── shared/
│   │   └── scraper-utils.js    # Shared scraper utilities
│   ├── carmax/
│   │   └── scrape.js
│   ├── carvana/
│   │   └── scrape.js
│   ├── plattauto/
│   │   └── scrape.js
│   ├── mock-source/
│   │   └── scrape.js
│   ├── run-all.js              # Run all scrapers sequentially
│   └── generate-mock-history.js
├── src/
│   ├── components/
│   │   ├── OverviewChart.jsx   # Main overview with all models
│   │   ├── DetailChart.jsx     # Per-model price ranges
│   │   └── ListingsTable.jsx   # Individual listings table
│   ├── services/
│   │   └── dataLoader.js       # Load and process JSON data
│   ├── utils/
│   │   └── chartLabels.js      # Reusable chart label plugin
│   └── App.jsx
├── config/
│   └── models.json             # EV models to track
└── vite.config.js              # Vite config with data copy plugin
```

## Adding a New Scraper

All scrapers use the shared utilities in `scrapers/shared/scraper-utils.js` for consistency:

1. **Create the scraper directory**:
   ```bash
   mkdir scrapers/newsource
   ```

2. **Create `scrape.js`** using the template pattern:
   ```javascript
   import { setupScraper, scrapeModel } from '../shared/scraper-utils.js';

   const SOURCE_NAME = 'newsource';

   function buildSearchUrl(make, model) {
     return `https://newsource.com/cars/${make}-${model}`;
   }

   async function scrapeListings(page, make, model) {
     const url = buildSearchUrl(make, model);
     await page.goto(url, { waitUntil: 'networkidle0' });

     // Runs in the browser context; extract one listing object per card
     return await page.evaluate(() => {
       const listings = [];
       document.querySelectorAll('.listing-card').forEach(card => {
         listings.push({
           price: parseInt(card.querySelector('.price').textContent.replace(/\D/g, ''), 10),
           year: parseInt(card.querySelector('.year').textContent, 10),
           trim: card.querySelector('.trim').textContent.trim(),
           mileage: parseInt(card.querySelector('.mileage').textContent.replace(/\D/g, ''), 10),
           url: card.querySelector('a').href
         });
       });
       return listings;
     });
   }

   (async () => {
     const { browser, models } = await setupScraper(SOURCE_NAME);

     try {
       for (const { make, model } of models) {
         await scrapeModel(browser, SOURCE_NAME, make, model, scrapeListings);
       }
     } finally {
       await browser.close();
     }
   })();
   ```

3. **Register in `run-all.js`**: add the new source to the scrapers array

4. **Test locally**:
   ```bash
   node scrapers/newsource/scrape.js
   ```

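The registration step above depends on how `run-all.js` is actually structured; a minimal sketch, assuming it simply runs each source's `scrape.js` in sequence (the real script may differ):

```javascript
// run-all.js (sketch) -- 'newsource' is the hypothetical scraper added above
import { execFileSync } from 'node:child_process';
import { existsSync } from 'node:fs';
import path from 'node:path';

// One entry per directory under scrapers/ that contains a scrape.js
const scrapers = ['carmax', 'carvana', 'plattauto', 'newsource'];

for (const name of scrapers) {
  const script = path.join('scrapers', name, 'scrape.js');
  if (!existsSync(script)) continue; // skip sources not present in this checkout
  console.log(`Running ${script}...`);
  // Sequential execution keeps the workflow from hitting sources in parallel
  execFileSync(process.execPath, [script], { stdio: 'inherit' });
}
```

Running sources sequentially rather than in parallel also makes per-source failures easier to spot in the Actions log.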
## GitHub Actions Workflows

### Deploy Workflow (`deploy.yml`)

- **Triggers**: Push to main, manual workflow dispatch
- **Purpose**: Fast deployment of frontend changes
- **Steps**:
  1. Check out the repository
  2. Install dependencies
  3. Build the frontend (includes copying the data directory)
  4. Deploy to GitHub Pages

### Scrape and Deploy Workflow (`scrape-and-deploy.yml`)

- **Triggers**: Daily at midnight UTC, manual workflow dispatch
- **Purpose**: Collect fresh price data and update the site
- **Steps**:
  1. Check out the repository
  2. Install dependencies and Chromium
  3. Run all scrapers (model-first iteration)
  4. Commit scraped data to the repository
  5. Build the frontend
  6. Deploy to GitHub Pages

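The daily trigger in `scrape-and-deploy.yml` is a standard `schedule` block; a sketch of the relevant fragment (the actual workflow file may differ):

```yaml
# scrape-and-deploy.yml (fragment, sketch)
on:
  schedule:
    - cron: '0 0 * * *'    # daily at midnight UTC
  workflow_dispatch: {}    # enables manual runs via `gh workflow run`
```
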
### Manual Trigger

To manually run the scraping workflow:
```bash
gh workflow run "Scrape Prices and Deploy"
```

## Data Format

Each scraper outputs JSON files with this structure:

```json
{
  "source": "carmax",
  "date": "2025-10-20",
  "listings": [
    {
      "make": "Tesla",
      "model": "Model 3",
      "year": 2023,
      "trim": "Long Range",
      "price": 35990,
      "mileage": 12500,
      "url": "https://..."
    }
  ]
}
```

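The frontend's `dataLoader.js` consumes files of this shape. As an illustration of the kind of aggregation behind the overview chart, here is a small sketch (`averagePrices` is a hypothetical helper, not necessarily the real `dataLoader.js` API):

```javascript
// Average listing price per "Make Model" for one day's scrape file.
function averagePrices(day) {
  const totals = new Map();
  for (const { make, model, price } of day.listings) {
    const key = `${make} ${model}`;
    const t = totals.get(key) ?? { sum: 0, count: 0 };
    t.sum += price;
    t.count += 1;
    totals.set(key, t);
  }
  // Map preserves insertion order, so models appear in first-seen order
  return Object.fromEntries(
    [...totals].map(([key, { sum, count }]) => [key, sum / count])
  );
}

const sample = {
  source: 'carmax',
  date: '2025-10-20',
  listings: [
    { make: 'Tesla', model: 'Model 3', price: 35990 },
    { make: 'Tesla', model: 'Model 3', price: 32010 },
    { make: 'Nissan', model: 'Leaf', price: 18500 },
  ],
};

console.log(averagePrices(sample));
// { 'Tesla Model 3': 34000, 'Nissan Leaf': 18500 }
```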
## Tracked Models

Configured in `config/models.json`:

- Tesla: Model 3, Model Y, Model S, Model X
- Nissan: Leaf, Ariya
- Chevrolet: Bolt EV, Bolt EUV
- Ford: Mustang Mach-E
- Hyundai: Ioniq 5
- Volkswagen: ID.4

## Development

### Build for Production

```bash
npm run build
```

### Test Production Build Locally

```bash
# Build the app
npm run build

# Create a symlink so paths match the GitHub Pages base
ln -s dist usedevpricetracker

# Serve from the project root
python3 -m http.server 8001

# Visit http://localhost:8001/usedevpricetracker/
```

## Configuration

### Vite Base Path

The app is configured for GitHub Pages deployment at `/usedevpricetracker/`:

```javascript
// vite.config.js
export default defineConfig({
  base: '/usedevpricetracker/'
});
```

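The project tree notes that `vite.config.js` also copies the `data/` directory into the build. A minimal sketch of how such a plugin could look (the hook choice and copy target are assumptions, not the project's actual code):

```javascript
// vite.config.js (sketch) -- the real config may implement this differently
import { defineConfig } from 'vite';
import { cpSync } from 'node:fs';

export default defineConfig({
  base: '/usedevpricetracker/',
  plugins: [
    {
      name: 'copy-data',
      // closeBundle runs after the bundle is written to dist/, so the
      // static site can fetch JSON at <base>/data/{source}/{date}.json
      closeBundle() {
        cpSync('data', 'dist/data', { recursive: true });
      },
    },
  ],
});
```
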
### Models to Track

Edit `config/models.json` to add/remove models:

```json
[
  { "make": "Tesla", "model": "Model 3" },
  { "make": "Nissan", "model": "Ariya" }
]
```

## License

MIT