A simple dashboard that tracks LNG vessel departures from the LNG Canada facility in Kitimat, BC using data stored in AWS S3.
This application uses a serverless, S3-based architecture:
- Data Storage: CSV file in AWS S3 bucket (fossil-fuel-shipments-tracker/lng-shipments.csv)
- Scraper: Node.js script that scrapes VesselFinder.com and updates S3
- Frontend: Static React app that reads directly from S3
No backend server or database required!
- Node.js 18+ installed
- AWS account with S3 access
- AWS credentials configured
Create an S3 bucket named fossil-fuel-shipments-tracker in your AWS account, and add a CORS configuration so the frontend can read the CSV from the browser.
Then either grant public read access to the lng-shipments.csv object, or configure AWS credentials in the frontend.
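A CORS configuration for the bucket might look like the following sketch; the allowed origin shown is the local dev server and is an assumption, so restrict it to your dashboard's real domain in production:

```json
[
  {
    "AllowedHeaders": ["*"],
    "AllowedMethods": ["GET", "HEAD"],
    "AllowedOrigins": ["http://localhost:3000"],
    "ExposeHeaders": ["ETag"],
    "MaxAgeSeconds": 3000
  }
]
```

It can be applied with the AWS CLI, e.g. `aws s3api put-bucket-cors --bucket fossil-fuel-shipments-tracker --cors-configuration file://cors.json`.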
Copy .env.example to .env and fill in your AWS credentials.
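The .env file typically holds values like the following. The first three variable names are the standard ones read by the AWS SDK; the bucket variable name is an assumption and may differ in this project:

```
AWS_ACCESS_KEY_ID=your-access-key
AWS_SECRET_ACCESS_KEY=your-secret-key
AWS_REGION=us-west-2
S3_BUCKET=fossil-fuel-shipments-tracker
```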
```
npm run install-all
```

This installs dependencies for both the scraper and the React frontend.
To scrape VesselFinder for departed vessels and update S3:
```
npm run scrape
```

This will:
- Scrape the VesselFinder port page for Kitimat (CAKTM001)
- Extract departure information for LNG tankers
- Fetch vessel details (IMO, MMSI, destination, ETA)
- Update the CSV file in S3 (avoiding duplicates)
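The duplicate-avoidance step in the update can be sketched as a pure merge over parsed CSV rows. Keying on vessel name plus departure date is an assumption; the actual scraper may use the IMO number instead:

```javascript
// Merge newly scraped rows into the existing rows, skipping duplicates.
// A row counts as a duplicate if one with the same vessel_name and
// departure_date is already present (assumed key; adjust as needed).
function mergeShipments(existingRows, newRows) {
  const seen = new Set(
    existingRows.map((r) => `${r.vessel_name}|${r.departure_date}`)
  );
  const merged = [...existingRows];
  for (const row of newRows) {
    const key = `${row.vessel_name}|${row.departure_date}`;
    if (!seen.has(key)) {
      seen.add(key);
      merged.push(row);
    }
  }
  return merged;
}
```

The merged array is then serialized back to CSV and uploaded over the existing S3 object.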
To start the React dashboard locally:
```
npm start
```

The dashboard will open at http://localhost:3000 and display:
- Total shipments tracked
- Total LNG capacity shipped
- Searchable table of all departures with vessel details
The dashboard automatically refreshes data from S3 every 5 minutes.
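The two summary figures can be derived directly from the parsed rows. A minimal sketch, assuming capacity_cbm arrives as a numeric string in cubic metres:

```javascript
// Compute the dashboard's summary metrics from parsed CSV rows.
// Rows with a missing or non-numeric capacity contribute zero.
function summarize(rows) {
  const totalShipments = rows.length;
  const totalCapacityCbm = rows.reduce(
    (sum, r) => sum + (Number(r.capacity_cbm) || 0),
    0
  );
  return { totalShipments, totalCapacityCbm };
}
```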
To create a production build of the React app:
```
npm run build
```

The build files will be in client/build/ and can be deployed to any static hosting service.
The S3 CSV file has these columns: vessel_name, imo_number, mmsi, capacity_cbm, CER_reported_payload, departure_date, destination_port, destination_country, estimated_arrival, notes
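Because the frontend reads the raw CSV from S3, it needs a small parser. The sketch below assumes field values contain no embedded commas or quotes; a real implementation should use a CSV library such as PapaParse:

```javascript
// Parse the shipments CSV into row objects keyed by the header columns.
// Assumes simple fields with no embedded commas or quotes.
function parseShipmentsCsv(text) {
  const lines = text.trim().split(/\r?\n/);
  const headers = lines[0].split(",");
  return lines.slice(1).map((line) => {
    const values = line.split(",");
    const row = {};
    headers.forEach((h, i) => {
      row[h] = values[i] ?? "";
    });
    return row;
  });
}
```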
MIT