This code downloads solar forecasts and generation data and saves them to a PostgreSQL database. It fetches solar generation estimates for embedded solar farms and processes the data for analysis. We currently collect:
- UK: Forecasts can be retrieved from NESO. Generation data can be retrieved from PVLive.
- NL: Generation values (national and regional) from Ned NL. National forecast values also come from Ned NL.
- DE: Generation values from ENTSOE for several TSOs.
- BE: Solar PV forecast data (national and regional) from Elia Open Data API.
- India (Rajasthan): Real-time solar and wind generation data from RUVNL (Rajasthan Urja Vikas Nigam Limited).
Here are the different sources of data and the methods that can be used to save the results:
| Source | Country | CSV | Data Platform | DB (Legacy) | Site DB (Legacy) |
|---|---|---|---|---|---|
| PVLive | 🇬🇧 | ✅ | ✅ | | |
| NESO forecast | 🇬🇧 | ✅ | ✅ | | |
| Ned-nl | 🇳🇱 | ✅ | ✅ | | |
| Ned-nl forecast | 🇳🇱 | ✅ | ✅ | | |
| Germany (ENTSOE) | 🇩🇪 | ✅ | ✅ | | |
| Elia Open Data | 🇧🇪 | ✅ | | | |
| RUVNL (Rajasthan SLDC) | 🇮🇳 | ✅ | | | |
To run the application you will need:
- Docker
- Docker Compose
- Clone the repository:

```bash
git clone https://github.com/openclimatefix/neso-solar-consumer.git
cd neso-solar-consumer
```

- Copy the example environment file:

```bash
cp .example.env .env
```

- Start the application:

```bash
docker compose up -d
```

The above command will:
- Start a PostgreSQL database container
- Build and start the NESO Solar Consumer application
- Configure all necessary networking between containers
To stop the application:

```bash
docker compose down
```

To view logs:

```bash
docker compose logs -f
```

Note: The PostgreSQL data is persisted in a Docker volume. To completely reset the database, use:

```bash
docker compose down -v
```
The package provides three main functionalities:
- Data Fetching: Retrieves solar forecast data from the NESO API
- Data Formatting: Processes the data into standardized forecast objects
- Data Storage: Saves the formatted forecasts to a PostgreSQL database
- `fetch_data.py`: Handles API data retrieval
- `format_forecast.py`: Converts raw data into forecast objects
- `save_forecast.py`: Manages database operations
- `app.py`: Orchestrates the entire pipeline
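Together, these modules form a simple fetch → format → save pipeline. Below is a minimal sketch of how `app.py` might wire them together; the function names and signatures are illustrative placeholders, not the package's actual API:

```python
# Sketch of the app.py orchestration. The imported function names are
# hypothetical; check the real modules for the actual interfaces.
import os

from solar_consumer.fetch_data import fetch_data            # hypothetical name
from solar_consumer.format_forecast import format_forecast  # hypothetical name
from solar_consumer.save_forecast import save_forecast      # hypothetical name


def run() -> None:
    country = os.getenv("COUNTRY", "gb")
    save_method = os.getenv("SAVE_METHOD", "db")

    raw = fetch_data(country)                     # 1. pull raw records from the source API
    forecasts = format_forecast(raw)              # 2. standardize into forecast objects
    save_forecast(forecasts, method=save_method)  # 3. persist (db / csv / site-db)


if __name__ == "__main__":
    run()
```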
- `DB_URL=postgresql://postgres:postgres@localhost:5432/neso_solar`: Database connection string
- `COUNTRY="gb"`: Country code for fetching data. Other current options are `["be", "ind_rajasthan", "nl"]`
- `SAVE_METHOD="db"`: How to store the data. Other current options are `["csv", "site-db"]`
- `CSV_DIR=None`: Directory to save CSV files if `SAVE_METHOD="csv"`
- `UK_PVLIVE_REGIME=in-day`: For UK PVLive, the regime. Can be `"in-day"` or `"day-after"`
- `UK_PVLIVE_N_GSPS=342`: For UK PVLive, the number of GSPs we pull data for
- `UK_PVLIVE_BACKFILL_HOURS=2`: For UK PVLive, the number of backfill hours we pull when `regime="in-day"`
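For reference, here is a sketch of reading these variables with their documented defaults. The parsing code itself is illustrative, not the package's actual configuration module:

```python
# Illustrative only: reading the configuration variables documented above.
import os

DB_URL = os.getenv("DB_URL", "postgresql://postgres:postgres@localhost:5432/neso_solar")
COUNTRY = os.getenv("COUNTRY", "gb")          # "gb", "be", "ind_rajasthan", "nl"
SAVE_METHOD = os.getenv("SAVE_METHOD", "db")  # "db", "csv", "site-db"
CSV_DIR = os.getenv("CSV_DIR")                # only needed when SAVE_METHOD == "csv"
UK_PVLIVE_REGIME = os.getenv("UK_PVLIVE_REGIME", "in-day")  # or "day-after"
UK_PVLIVE_N_GSPS = int(os.getenv("UK_PVLIVE_N_GSPS", "342"))
UK_PVLIVE_BACKFILL_HOURS = int(os.getenv("UK_PVLIVE_BACKFILL_HOURS", "2"))

if SAVE_METHOD == "csv" and not CSV_DIR:
    raise ValueError("CSV_DIR must be set when SAVE_METHOD='csv'")
```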
This guide explains how to add a new country data source to Solar Consumer.
Adding a country typically involves:
- Identifying a reliable data source or API
- Implementing a country-specific fetch function
- Adding tests
- Saving data locally (CSV) and/or to the data platform
Identify a reliable data source for the country:
- Prefer official grid operators or government-backed APIs
- Ensure timestamps, units, and generation values are clearly defined
If the API requires credentials:
- Add the variable to `.example.env`
- Document the required environment variable name
Add a new country-specific fetch module inside the solar_consumer package.
Example naming convention:
`solar_consumer/data/fetch_<country>.py`
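As a starting point, here is a skeleton of what a new fetch module might look like. The module name, function name, endpoint URL, and column names below are all placeholders; match the conventions of the existing `fetch_<country>.py` modules instead:

```python
# solar_consumer/data/fetch_example.py -- illustrative skeleton only.
import pandas as pd
import requests


def fetch_example_data(api_url: str = "https://example.invalid/solar") -> pd.DataFrame:
    """Fetch raw generation records and return a tidy DataFrame."""
    response = requests.get(api_url, timeout=30)
    response.raise_for_status()
    records = response.json()

    df = pd.DataFrame(records)
    # Normalize to UTC timestamps and megawatts, matching how the
    # pipeline stores data (see the existing fetch modules for the
    # exact expected schema -- these column names are assumptions).
    df["timestamp_utc"] = pd.to_datetime(df["timestamp"], utc=True)
    df["power_mw"] = df["generation_kw"].astype(float) / 1000.0
    return df[["timestamp_utc", "power_mw"]]
```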
After adding the fetch function:
- Register the country in the main fetch dispatcher
- Add unit and integration tests under `tests/` (see the sketch after this list)
- Verify the fetcher runs locally and its data can be saved to CSV
- If supported, ensure data can be saved to the data platform
- Open a pull request for review
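For instance, a unit test for the hypothetical fetcher sketched above might look like this, with the HTTP call monkeypatched so no network access is needed:

```python
# tests/test_fetch_example.py -- illustrative test for the skeleton above.
import solar_consumer.data.fetch_example as fetch_example


def test_fetch_example_data(monkeypatch):
    class FakeResponse:
        def raise_for_status(self):
            pass

        def json(self):
            return [{"timestamp": "2024-01-01T12:00:00Z", "generation_kw": 1500}]

    # Replace the real HTTP call with a canned response.
    monkeypatch.setattr(
        fetch_example.requests, "get", lambda url, timeout: FakeResponse()
    )

    df = fetch_example.fetch_example_data()
    assert list(df.columns) == ["timestamp_utc", "power_mw"]
    assert df.loc[0, "power_mw"] == 1.5                  # kW converted to MW
    assert df.loc[0, "timestamp_utc"].tzinfo is not None  # timezone-aware UTC
```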
If the country supports saving data to the data platform:
- Clone the data platform repository:
git clone https://github.com/openclimatefix/data-platform.git
- Follow the data-platform README to start it locally (Docker-based setup).
- Configure Solar Consumer to point to the local data-platform instance.
- Run the consumer and verify data is ingested successfully.
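One quick way to verify ingestion is to count rows in the target database. A minimal sketch, assuming the consumer writes to the Postgres instance behind `DB_URL`; the table name `forecast_values` is a guess, so inspect the actual schema for the real one:

```python
# Quick ingestion check -- the table name is an assumption.
import os

import sqlalchemy as sa

engine = sa.create_engine(os.environ["DB_URL"])
with engine.connect() as conn:
    count = conn.execute(sa.text("SELECT count(*) FROM forecast_values")).scalar()
print(f"{count} forecast rows found")
```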
- Set up the development environment:

```bash
pip install ".[dev]"
```

- Run tests:

```bash
pytest
```

- Format code:

```bash
black .
```

- Run linter:

```bash
ruff check .
```

The test suite includes unit tests and integration tests:
```bash
# Run all tests
pytest

# Run specific test file
pytest tests/test_fetch_data.py

# Run with coverage
pytest --cov=neso_solar_consumer
```

This repository has two main CI workflows: `branch-ci` and `merged-ci`.
- `branch-ci` is triggered on all pushes to any branch except `main`, and on any pull request that is opened, reopened, or updated. It runs the test suite, lints the project, and builds and pushes a dev image.
- `merged-ci` is triggered on any pull request merged into `main`. It bumps the git tag, and builds and pushes a container with that tag.
Q: What format is the data stored in? A: The data is stored in PostgreSQL using SQLAlchemy models, with timestamps in UTC and power values in megawatts.
Q: How often should I run the consumer? A: This depends on your use case and the NESO API update frequency. The consumer can be scheduled using cron jobs or other scheduling tools.
This project is licensed under the MIT License - see the LICENSE file for details.
- PRs are welcome! See the Organisation Profile for details on contributing
- Find out about our other projects in the OCF Meta Repo
- Check out the OCF blog for updates
- Follow OCF on LinkedIn
Part of the Open Climate Fix community.
