
Commit d42428c
Author: Micha
Merge branch 'main' into regret-matrix-draft
2 parents: db23995 + f26bcc5


61 files changed: +2144 / -1031 lines

.github/workflows/codeql.yaml

Lines changed: 1 addition & 1 deletion
@@ -55,7 +55,7 @@ jobs:
       # your codebase is analyzed, see https://docs.github.com/en/code-security/code-scanning/creating-an-advanced-setup-for-code-scanning/codeql-code-scanning-for-compiled-languages
     steps:
       - name: Checkout repository
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5

       # Initializes the CodeQL tools for scanning.
       - name: Initialize CodeQL

.github/workflows/push-images.yaml

Lines changed: 1 addition & 1 deletion
@@ -16,7 +16,7 @@ jobs:
     name: dev-env
     runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@v4
+      - uses: actions/checkout@v5
        with:
          fetch-depth: 0

.github/workflows/test.yaml

Lines changed: 3 additions & 3 deletions
@@ -35,7 +35,7 @@ jobs:
         shell: bash -l {0}

     steps:
-      - uses: actions/checkout@v4
+      - uses: actions/checkout@v5

       - name: Setup env file path (ubuntu)
         if: matrix.os == 'ubuntu'
@@ -53,8 +53,8 @@

       - name: Use base env file if it was changed
         run: |
-          git fetch origin master
-          if git diff --name-only origin/master | grep '${{ env.BASE_ENV }}'; then
+          git fetch origin ${{ github.event.repository.default_branch }}
+          if git diff --name-only origin/${{ github.event.repository.default_branch }} | grep '${{ env.BASE_ENV }}'; then
            echo "Base env ${{ env.BASE_ENV }} changed. Using it instead of locked envs."
            echo "env_file=${{ env.BASE_ENV }}" >> $GITHUB_ENV
          else
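
The hard-coded `master` reference is replaced by the default branch reported in the event payload. A rough local equivalent of the same check, assuming `origin/HEAD` is set (run `git remote set-head origin --auto` once if it is not) and with `BASE_ENV` standing in for the path the workflow keeps in `env.BASE_ENV`:

```
# Resolve the default branch from origin/HEAD instead of hard-coding it.
DEFAULT_BRANCH=$(git symbolic-ref --short refs/remotes/origin/HEAD | sed 's|^origin/||')
git fetch origin "$DEFAULT_BRANCH"
# BASE_ENV is a placeholder for the workflow's env.BASE_ENV value.
if git diff --name-only "origin/$DEFAULT_BRANCH" | grep -q "$BASE_ENV"; then
  echo "Base env $BASE_ENV changed. Using it instead of locked envs."
fi
```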

.github/workflows/update-pinned-env.yaml

Lines changed: 3 additions & 12 deletions
@@ -19,7 +19,7 @@ jobs:
     run:
       shell: bash -l {0}
     steps:
-      - uses: actions/checkout@v4
+      - uses: actions/checkout@v5

       - name: Setup conda
         uses: conda-incubator/setup-miniconda@v3
@@ -43,15 +43,6 @@
          for file in envs/*.lock.yml; do
            mv "$file" "${file%.yml}.yaml"
          done
-      - name: Add SPDX headers to lock files
-        run: |
-          SPDX_HEADER="# SPDX-FileCopyrightText: Contributors to PyPSA-Eur <https://github.com/pypsa/pypsa-eur>\n# SPDX-License-Identifier: CC0-1.0\n"
-
-          # Add header to all generated lock files
-          for file in envs/*.lock.yaml; do
-            echo "Adding header to $file"
-            echo -e "$SPDX_HEADER" | cat - "$file" > temp && mv temp "$file"
-          done

       - name: Insert environment name in lock files
         run: |
@@ -78,9 +69,9 @@
     needs: update-locked-environment
     runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@v4
+      - uses: actions/checkout@v5
       - name: Download all artifacts
-        uses: actions/download-artifact@v4
+        uses: actions/download-artifact@v5
        with:
          name: lockfiles
          path: envs/

CHANGELOG.md

Lines changed: 1 addition & 0 deletions
@@ -4,6 +4,7 @@
 - Added an option to source mobility demand from UBA MWMS (Projektionsbericht 2025) for the years 2025-2035
 - Renamed functions and script for exogenous mobility demand
 - Improved the transport demand data, added an option to source 2020 and 2025 data from AGEB instead of Aladin
+- Added a helper function to change the weather_year to build_scenario
 - Longer lifetime (40 years) is only applied to existing gas CHPs, not new ones. Added a new config entry `existing_capacities:fill_value_gas_chp_lifetime`
 - Bugfix: gas CHPs are extendable again
 - Simplified scenarion definition and made `Mix` the default scenario

Makefile

Lines changed: 2 additions & 1 deletion
@@ -82,7 +82,7 @@ test:
	echo "Build scenarios..."
	snakemake -call build_scenarios
	echo "Run DACH config..."
-	snakemake -call ariadne_all --configfile=config/test/config.dach.yaml
+	snakemake -call ariadne_all --until export_ariadne_variables --configfile=config/test/config.dach.yaml
	echo "All tests completed successfully."

 unit-test:
@@ -95,6 +95,7 @@ clean-tests:
	snakemake -call --configfile config/test/config.myopic.yaml --delete-all-output
	snakemake -call make_summary_perfect --configfile config/test/config.perfect.yaml --delete-all-output
	snakemake -call --configfile config/test/config.scenarios.yaml -n --delete-all-output
+	snakemake -call plot_power_networks_clustered --configfile config/test/config.tyndp.yaml --delete-all-output

 # Removes all created files except for large cutout files (similar to fresh clone)
 reset:
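
The added `--until export_ariadne_variables` stops the DACH test once the Ariadne variables are exported, so the slower downstream plotting rules are skipped. A dry run previews which jobs that shortened test would schedule, using the same flags as the recipe above:

```
# List the jobs the shortened DACH test would execute without running them.
snakemake -n ariadne_all --until export_ariadne_variables \
    --configfile=config/test/config.dach.yaml
```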

README.md

Lines changed: 3 additions & 3 deletions
@@ -15,13 +15,13 @@ This repository contains the entire scientific project, including data sources a
 You need `conda` or `mamba` to run the analysis. Using conda, you can create an environment from within which you can run the analysis:

 ```
-conda env create -f envs/{os}-pinned.yaml
+conda env create -f envs/{os}.lock.yaml
 ```

 Where `{os}` should be replaced with your operating system, e.g. for linux the command would be:

 ```
-conda env create -f envs/linux-pinned.yaml
+conda env create -f envs/linux-64.lock.yaml
 ```

 ## Run the analysis
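
With the README now pointing at the lock-file names produced by `update-pinned-env.yaml`, setup looks roughly like the sketch below; the environment name is an assumption, so check the `name:` field at the top of the lock file before activating:

```
# Create the pinned environment and activate it (env name is assumed;
# the actual name is set in the lock file's `name:` field).
conda env create -f envs/linux-64.lock.yaml
conda activate pypsa-eur   # placeholder name
```
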
@@ -105,7 +105,7 @@ SPDX-License-Identifier: CC-BY-4.0
 ![Size](https://img.shields.io/github/repo-size/pypsa/pypsa-eur)
 [![Zenodo PyPSA-Eur](https://zenodo.org/badge/DOI/10.5281/zenodo.3520874.svg)](https://doi.org/10.5281/zenodo.3520874)
 [![Zenodo PyPSA-Eur-Sec](https://zenodo.org/badge/DOI/10.5281/zenodo.3938042.svg)](https://doi.org/10.5281/zenodo.3938042)
-[![Snakemake](https://img.shields.io/badge/snakemake-≥8.14.0-brightgreen.svg?style=flat)](https://snakemake.readthedocs.io)
+[![Snakemake](https://img.shields.io/badge/snakemake-≥9-brightgreen.svg?style=flat)](https://snakemake.readthedocs.io)
 [![Discord](https://img.shields.io/discord/911692131440148490?logo=discord)](https://discord.gg/AnuJBk23FU)
 [![REUSE status](https://api.reuse.software/badge/github.com/pypsa/pypsa-eur)](https://api.reuse.software/info/github.com/pypsa/pypsa-eur)

REUSE.toml

Lines changed: 25 additions & 0 deletions
@@ -0,0 +1,25 @@
+version = 1
+SPDX-PackageName = "PyPSA-Eur"
+SPDX-PackageSupplier = "Tom Brown <[email protected]>"
+SPDX-PackageDownloadLocation = "https://github.com/pypsa/pypsa-eur"
+
+[[annotations]]
+path = [
+    "data/**",
+    "doc/configtables/*",
+    "doc/data.csv",
+    "doc/img/*",
+    "test/test_data/*",
+]
+SPDX-FileCopyrightText = "The PyPSA-Eur Authors"
+SPDX-License-Identifier = "CC-BY-4.0"
+
+[[annotations]]
+path = [
+    ".github/**",
+    "borg-it",
+    "envs/*.lock.yaml",
+    "matplotlibrc",
+]
+SPDX-FileCopyrightText = "The PyPSA-Eur Authors"
+SPDX-License-Identifier = "CC0-1.0"

Snakefile

Lines changed: 13 additions & 16 deletions
@@ -82,6 +82,7 @@ if config["foresight"] == "perfect":
 rule all:
     input:
         expand(RESULTS + "graphs/costs.svg", run=config["run"]["name"]),
+        expand(resources("maps/power-network.pdf"), run=config["run"]["name"]),
         expand(
             resources("maps/power-network-s-{clusters}.pdf"),
             run=config["run"]["name"],
@@ -130,21 +131,17 @@ rule all:
             run=config["run"]["name"],
             carrier=config_provider("plotting", "balance_map", "bus_carriers")(w),
         ),
-        directory(
-            expand(
-                RESULTS
-                + "graphics/balance_timeseries/s_{clusters}_{opts}_{sector_opts}_{planning_horizons}",
-                run=config["run"]["name"],
-                **config["scenario"],
-            ),
+        expand(
+            RESULTS
+            + "graphics/balance_timeseries/s_{clusters}_{opts}_{sector_opts}_{planning_horizons}",
+            run=config["run"]["name"],
+            **config["scenario"],
         ),
-        directory(
-            expand(
-                RESULTS
-                + "graphics/heatmap_timeseries/s_{clusters}_{opts}_{sector_opts}_{planning_horizons}",
-                run=config["run"]["name"],
-                **config["scenario"],
-            ),
+        expand(
+            RESULTS
+            + "graphics/heatmap_timeseries/s_{clusters}_{opts}_{sector_opts}_{planning_horizons}",
+            run=config["run"]["name"],
+            **config["scenario"],
         ),
     default_target: True

@@ -202,7 +199,7 @@ rule rulegraph:
         r"""
         # Generate DOT file using nested snakemake with the dumped final config
         echo "[Rule rulegraph] Using final config file: {input.config_file}"
-        snakemake --rulegraph all --configfile {input.config_file} --quiet | sed -n "/digraph/,\$p" > {output.dot}
+        snakemake --rulegraph --configfile {input.config_file} --quiet | sed -n "/digraph/,\$p" > {output.dot}

         # Generate visualizations from the DOT file
         if [ -s {output.dot} ]; then
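
Dropping the explicit `all` target lets the nested call graph whatever the default target rule of the dumped config is. Outside the rule, the same pipeline can be reproduced roughly as follows; the config path and output names are placeholders for the rule's `{input.config_file}` and `{output.dot}`, and rendering needs graphviz:

```
# Rebuild the rulegraph by hand (paths are placeholders for
# {input.config_file} and {output.dot}).
snakemake --rulegraph --configfile config/final.yaml --quiet \
    | sed -n '/digraph/,$p' > rulegraph.dot
dot -Tpdf rulegraph.dot -o rulegraph.pdf
```
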
@@ -415,7 +412,7 @@ rule prepare_district_heating_subnodes:
        cities="data/fernwaermeatlas/cities_geolocations.geojson",
        lau_regions="data/lau_regions.zip",
        census=storage(
-            "https://www.zensus2022.de/static/Zensus_Veroeffentlichung/Zensus2022_Heizungsart.zip",
+            "https://www.destatis.de/static/DE/zensus/gitterdaten/Zensus2022_Heizungsart.zip",
            keep_local=True,
        ),
        osm_land_cover=storage(

config/config.de.yaml

Lines changed: 7 additions & 1 deletion
@@ -51,6 +51,7 @@ iiasa_database:
   uba_for_mobility: false # For 2025–2035 use MWMS scenario from UBA Projektionsbericht 2025
   uba_for_industry: false # For 2025–2035 use MWMS scenario from UBA Projektionsbericht 2025
   scale_industry_non_energy: false # Scale non-energy industry demand directly proportional to energy demand
+
 # docs in https://pypsa-eur.readthedocs.io/en/latest/configuration.html#foresight
 foresight: myopic

@@ -316,6 +317,11 @@ sector:
   rural: true
   co2_spatial: true
   biomass_spatial: true
+  ammonia: false
+  methanol:
+    methanol_to_power:
+      ocgt: false
+    biomass_to_methanol: false
   #relax so no infeasibility in 2050 with no land transport demand
   min_part_load_fischer_tropsch: 0.
   regional_oil_demand: true #set to true if regional CO2 constraints needed
@@ -326,6 +332,7 @@ sector:
   biogas_upgrading_cc: true
   biomass_to_liquid: true
   biomass_to_liquid_cc: true
+  electrobiofuels: false
   cluster_heat_buses: true
   # calculated based on ariadne "Stock|Space Heating"
   # and then 2% of buildings renovated per year to reduce their demand by 80%
@@ -365,7 +372,6 @@ sector:

 # docs in https://pypsa-eur.readthedocs.io/en/latest/configuration.html#industry
 industry:
-  ammonia: false
   steam_biomass_fraction: 0.4
   steam_hydrogen_fraction: 0.3
   steam_electricity_fraction: 0.3
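
These additions appear to mirror the upstream PyPSA-Eur sector options, with `ammonia` moved from the `industry` block to `sector`. A quick way to confirm the merged configuration still parses and the DAG builds, sketched under the assumption that the DE config is passed explicitly via `--configfile` (the rule name is illustrative):

```
# Dry run: check the merged YAML parses and the workflow plan resolves.
snakemake -n ariadne_all --configfile=config/config.de.yaml
```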
