Commit f4ec7bb

Merge branch 'develop' into 'master'

Develop into master. See merge request iek-3/shared-code/fine!326

2 parents: a8f4a11 + 4666bc5

22 files changed: +322 −160 lines

.github/workflows/test.yml

Lines changed: 86 additions & 0 deletions
The file is entirely new (all additions):

```yaml
on:
  push:
    branches:
      - master
      - develop
  pull_request:
    branches:
      - master
      - develop
  # Allows to trigger the workflow manually
  workflow_dispatch:
    branches:
      - master
      - develop
  schedule:
    # * is a special character in YAML so you have to quote this string
    # Some Examples for cron syntax https://crontab.guru/examples.html
    # Schedules job at any point after 12 pm
    - cron: '0 0 * * *'
    # Weekly after sunday
    # - cron: 0 0 * * 0

jobs:
  TestFineSingle:
    name: Ex1 (${{ matrix.python-version }}, ${{ matrix.os }})
    runs-on: ${{ matrix.os }}
    strategy:
      fail-fast: false
      matrix:
        os: ["ubuntu-latest", "macos-latest", "windows-latest"]
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          repository: FZJ-IEK3-VSA/FINE
          path: './fine'
      - uses: conda-incubator/setup-miniconda@v3
        with:
          miniforge-version: latest
          channels: conda-forge
          activate-environment: test_env
      - name: Run tests
        shell: pwsh
        run: |
          ls
          echo "LS Done"
          mamba install fine pytest
          echo "Installation done"
          conda list
          echo "libaries printed"
          echo "start pytest"
          pytest
          echo "Pytest done"

  TestFineDevLocal:
    name: Ex1 (${{ matrix.python-version }}, ${{ matrix.os }})
    runs-on: ${{ matrix.os }}
    strategy:
      fail-fast: false
      matrix:
        os: ["ubuntu-latest", "macos-latest", "windows-latest"]
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          repository: FZJ-IEK3-VSA/FINE
          path: './fine'
      - uses: conda-incubator/setup-miniconda@v3
        with:
          miniforge-version: latest
          channels: conda-forge
          activate-environment: test_env
      - name: Run tests
        shell: pwsh
        run: |
          ls
          echo "LS Done"
          cd fine
          mamba env create --name fine_env --yes --file requirements_dev.yml
          conda run --name fine_env pip install . --no-deps
          echo "Installation done"
          conda list --name fine_env
          echo "libaries printed"
          echo "start pytest"
          conda run --name fine_env pytest
          echo "Pytest done"
```
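The `schedule` trigger above uses standard five-field cron syntax; `'0 0 * * *'` fires once a day at 00:00 UTC. A minimal sketch of how the five fields break down (the `parse_cron` helper is hypothetical, for illustration only, not part of this repository):

```python
def parse_cron(expr: str) -> dict:
    """Hypothetical helper: split a five-field cron expression into its parts.

    The fields are, in order: minute, hour, day-of-month, month, day-of-week.
    """
    fields = expr.split()
    if len(fields) != 5:
        raise ValueError(f"expected 5 cron fields, got {len(fields)}")
    names = ["minute", "hour", "day_of_month", "month", "day_of_week"]
    return dict(zip(names, fields))

# '0 0 * * *' from the workflow: minute 0, hour 0, every day -> daily at 00:00 UTC
print(parse_cron("0 0 * * *"))
# The commented-out '0 0 * * 0' would instead run weekly, on Sundays.
```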

.gitlab-ci.yml

Lines changed: 9 additions & 6 deletions
```diff
@@ -30,18 +30,20 @@ variables:
     policy: pull
   before_script:
     - micromamba install -n base -y --file=requirements_dev.yml
+    - python -m pip install .
   rules:
     # Switch from branch pipeline to merge pipeline once a merge request has
     # been created on the branch.
     - if: $CI_COMMIT_BRANCH && $CI_OPEN_MERGE_REQUESTS && $CI_PIPELINE_SOURCE == "push"
       when: never
     - if: $CI_COMMIT_BRANCH && $CI_OPEN_MERGE_REQUESTS
       when: never
+  retry: 1


 .test_docker_template_noupdate:
   stage: test
-  image: jugit-registry.fz-juelich.de/iek-3/shared-code/fine/fine-dev:latest
+  image: jugit-registry.fz-juelich.de/iek-3/shared-code/fine/fine-dev:latest
   before_script:
     - python -m pip install -e .
   rules:
@@ -68,7 +70,7 @@ variables:

 .test_docker_template:
   stage: test
-  image: jugit-registry.fz-juelich.de/iek-3/shared-code/fine/fine-dev:latest
+  image: jugit-registry.fz-juelich.de/iek-3/shared-code/fine/fine-dev:latest
   variables:
     CONDA_PKGS_DIRS: "$CI_PROJECT_DIR/.cache/pkgs"
   cache:
@@ -83,6 +85,7 @@ variables:
     policy: pull
   before_script:
     - micromamba install -n base -y --file=requirements_dev.yml
+    - python -m pip install .
   rules:
     # Do not run for pushes to master or develop and for merge requests to master
     - if: $CI_COMMIT_BRANCH == "master"
@@ -114,7 +117,7 @@ variables:
 test-code:
   extends: .test_template
   script:
-    - python -m pytest --cov=fine test/
+    - python -m pytest -n auto --cov=fine test/
   rules:
     # Run only for pushes to master or develop and for merge requests to master.
     # Do not run for scheduled pushes to master (runs `test-code-push-cache` instead).
@@ -129,20 +132,20 @@ test-code-push-cache:
   cache:
     policy: push
   script:
-    - python -m pytest --cov=fine test/
+    - python -m pytest -n auto --cov=fine test/
   rules:
     # Run only for scheduled pushes to master.
     - if: '$CI_PIPELINE_SOURCE == "schedule" && $CI_COMMIT_BRANCH == "master"'

 test-code-docker:
   extends: .test_docker_template
   script:
-    - python -m pytest --cov=fine test/
+    - python -m pytest -n auto --cov=fine test/

 test-code-docker-noupdate:
   extends: .test_docker_template_noupdate
   script:
-    - python -m pytest --cov=fine test/
+    - python -m pytest -n auto --cov=fine test/

 test-notebooks:
   extends: .test_template
```
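The recurring change in this file swaps `pytest --cov=fine test/` for `pytest -n auto --cov=fine test/`. The `-n` option comes from the pytest-xdist plugin (which has to be installed for the flag to exist); `-n auto` starts one worker process per available CPU. A rough sketch of that resolution logic, written here only to illustrate pytest-xdist's documented behaviour (the `resolve_worker_count` function is not part of pytest-xdist itself):

```python
import os

def resolve_worker_count(value: str) -> int:
    """Sketch of how pytest-xdist's ``-n`` option maps to a worker count:
    ``auto`` means one worker per CPU, otherwise an explicit integer."""
    if value == "auto":
        return os.cpu_count() or 1  # fall back to 1 if the CPU count is unknown
    return int(value)

print(resolve_worker_count("4"))     # 4
print(resolve_worker_count("auto"))  # machine-dependent, always >= 1
```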

README.md

Lines changed: 35 additions & 7 deletions
````diff
@@ -10,18 +10,35 @@

 # ETHOS.FINE - Framework for Integrated Energy System Assessment

+
 The ETHOS.FINE python package provides a framework for modeling, optimizing and assessing energy systems. With the provided framework, systems with multiple regions, commodities and time steps can be modeled. Target of the optimization is the minimization of the total annual cost while considering technical and environmental constraints. Besides using the full temporal resolution, an interconnected typical period storage formulation can be applied, that reduces the complexity and computational time of the model.

+This readme provides information on the installation of the package. For further information have a look at the [documentation](https://vsa-fine.readthedocs.io/en/latest/).
+
 ETHOS.FINE is used for the modelling of a diverse group of optimization problems within the [Energy Transformation PatHway Optimization Suite (ETHOS) at IEK-3](https://www.fz-juelich.de/de/iek/iek-3/leistungen/model-services).

 If you want to use ETHOS.FINE in a published work, please [**kindly cite following publication**](https://www.sciencedirect.com/science/article/pii/S036054421830879X) which gives a description of the first stages of the framework. The python package which provides the time series aggregation module and its corresponding literature can be found [here](https://github.com/FZJ-IEK3-VSA/tsam).

-## Features
-* representation of an energy system by multiple locations, commodities and time steps
-* complexity reducing storage formulation based on typical periods
+## Content
+<!-- TOC -->
+* [Requirements](#requirements)
+  * [Python package manager](#python-package-manager)
+  * [Mixed Integer Linear Programming (MILP) solver](#mixed-integer-linear-programming-milp-solver)
+* [Installation](#installation)
+  * [Installation via conda-forge](#installation-via-conda-forge)
+  * [Installation from a local folder](#installation-from-a-local-folder)
+  * [Installation for developers](#installation-for-developers)
+* [Installation of an optimization solver](#installation-of-an-optimization-solver)
+  * [Gurobi installation](#gurobi-installation)
+  * [GLPK installation](#glpk-installation)
+  * [CBC](#cbc)
+* [Examples](#examples)
+* [License](#license)
+* [About Us](#about-us-)
+* [Contributions and Users](#contributions-and-users)
+* [Acknowledgement](#acknowledgement)
+<!-- TOC -->

-## Documentation
-A "Read the Docs" documentation of ETHOS.FINE can be found [here](https://vsa-fine.readthedocs.io/en/latest/).

 ## Requirements

@@ -46,7 +63,7 @@ mamba create -n fine -c conda-forge fine
 ### Installation from a local folder
 Alternatively you can first clone the content of this repository and perform the installation from there:

-1. Clone the content of this repository
+1. (Shallow) clone the content of this repository
 ```bash
 git clone --depth 1 https://github.com/FZJ-IEK3-VSA/FINE.git
 ```
@@ -62,14 +79,25 @@ mamba env create -f requirements.yml
 ```bash
 mamba activate fine
 ```
+6. Install FINE with:
+```bash
+python -m pip install --no-deps .
+```

 ### Installation for developers
 If you want to work on the FINE codebase you need to run.
 ```bash
+git clone https://github.com/FZJ-IEK3-VSA/FINE.git
+```
+to get the whole git history and then
+```bash
 mamba env create -f requirements_dev.yml
 ```
 This installs additional dependencies such as `pytest` and installs FINE from the folder in editable mode with `pip -e`. Changes in the folder are then reflected in the package installation.
-
+Finally, install FINE in editable mode with:
+```bash
+python -m pip install --no-deps --editable .
+```
 Test your installation with the following command in the project root folder:
 ```
 pytest
````

docs/source/newsDoc.rst

Lines changed: 2 additions & 0 deletions
```diff
@@ -2,6 +2,8 @@
 FINE's News Feed
 ################
 
+Since version 2.3.3 this news feed is not updated anymore. Please refer to the [release page](https://github.com/FZJ-IEK3-VSA/FINE/releases) for changelogs.
+
 *********************
 Release version 2.3.2
 *********************
```

fine/IOManagement/standardIO.py

Lines changed: 15 additions & 15 deletions
```diff
@@ -95,7 +95,7 @@ def writeOptimizationOutputToExcel(
         if not optSum.empty:
             optSum.to_excel(
                 writer,
-                name[:-5]
+                sheet_name=name[:-5]
                 + "OptSummary_"
                 + esM.componentModelingDict[name].dimension,
             )
@@ -123,7 +123,7 @@ def writeOptimizationOutputToExcel(
                 ((dfTD1dim != 0) & (~dfTD1dim.isnull())).any(axis=1)
             ]
             if not dfTD1dim.empty:
-                dfTD1dim.to_excel(writer, name[:-5] + "_TDoptVar_1dim")
+                dfTD1dim.to_excel(writer, sheet_name=name[:-5] + "_TDoptVar_1dim")
         if dataTD2dim:
             names = ["Variable", "Component", "LocationIn", "LocationOut"]
             dfTD2dim = pd.concat(dataTD2dim, keys=indexTD2dim, names=names)
@@ -132,7 +132,7 @@ def writeOptimizationOutputToExcel(
                 ((dfTD2dim != 0) & (~dfTD2dim.isnull())).any(axis=1)
             ]
             if not dfTD2dim.empty:
-                dfTD2dim.to_excel(writer, name[:-5] + "_TDoptVar_2dim")
+                dfTD2dim.to_excel(writer, sheet_name=name[:-5] + "_TDoptVar_2dim")
         if dataTI:
             if esM.componentModelingDict[name].dimension == "1dim":
                 names = ["Variable type", "Component"]
@@ -144,7 +144,7 @@ def writeOptimizationOutputToExcel(
             if not dfTI.empty:
                 dfTI.to_excel(
                     writer,
-                    name[:-5]
+                    sheet_name=name[:-5]
                     + "_TIoptVar_"
                     + esM.componentModelingDict[name].dimension,
                 )
@@ -154,7 +154,7 @@ def writeOptimizationOutputToExcel(
         periodsOrder = pd.DataFrame(
             [esM.periodsOrder[_ip]], index=["periodsOrder"], columns=esM.periods
         )
-        periodsOrder.to_excel(writer, "Misc")
+        periodsOrder.to_excel(writer, sheet_name="Misc")
         if esM.segmentation:
             ls = []
             for i in esM.periodsOrder[_ip].tolist():
@@ -163,7 +163,7 @@ def writeOptimizationOutputToExcel(
                 columns={"Segment Duration": "timeStepsPerSegment"}
             )
             segmentDuration.index.name = "segmentNumber"
-            segmentDuration.to_excel(writer, "Misc", startrow=3)
+            segmentDuration.to_excel(writer, sheet_name="Misc", startrow=3)
         utils.output("\tSaving file...", esM.verbose, 0)
         writer.close()
         utils.output("Done. (%.4f" % (time.time() - _t) + " sec)", esM.verbose, 0)
@@ -809,7 +809,7 @@ def plotLocations(
     locationsShapeFileName,
     indexColumn,
     plotLocNames=False,
-    crs="epsg:3035",
+    crs="EPSG:3035",
     faceColor="none",
     edgeColor="black",
     fig=None,
@@ -840,7 +840,7 @@ def plotLocations(
     :type plotLocNames: boolean
 
     :param crs: coordinate reference system
-        |br| * the default value is 'epsg:3035'
+        |br| * the default value is 'EPSG:3035'
     :type crs: string
 
     :param faceColor: face color of the plot
@@ -884,7 +884,7 @@ def plotLocations(
     :type dpi: scalar > 0
     """
 
-    gdf = gpd.read_file(locationsShapeFileName).to_crs({"init": crs})
+    gdf = gpd.read_file(locationsShapeFileName).to_crs(crs)
 
     if ax is None:
         fig, ax = plt.subplots(1, 1, figsize=figsize, **kwargs)
@@ -919,7 +919,7 @@ def plotTransmission(
     loc0,
     loc1,
     ip=0,
-    crs="epsg:3035",
+    crs="EPSG:3035",
     variableName="capacityVariablesOptimum",
     color="k",
     loc=7,
@@ -961,7 +961,7 @@ def plotTransmission(
     :type ip: int
 
     :param crs: coordinate reference system
-        |br| * the default value is 'epsg:3035'
+        |br| * the default value is 'EPSG:3035'
     :type crs: string
 
     :param variableName: parameter for plotting installed capacity ('_capacityVariablesOptimum') or operation
@@ -1024,7 +1024,7 @@ def plotTransmission(
     if capMax == 0:
         return fig, ax
     cap = cap / capMax
-    gdf = gpd.read_file(transmissionShapeFileName).to_crs({"init": crs})
+    gdf = gpd.read_file(transmissionShapeFileName).to_crs(crs)
 
     if ax is None:
         fig, ax = plt.subplots(1, 1, figsize=figsize, **kwargs)
@@ -1082,7 +1082,7 @@ def plotLocationalColorMap(
     ip=0,
     perArea=True,
     areaFactor=1e3,
-    crs="epsg:3035",
+    crs="EPSG:3035",
     variableName="capacityVariablesOptimum",
     doSum=False,
     cmap="viridis",
@@ -1128,7 +1128,7 @@ def plotLocationalColorMap(
     :type areaFactor: scalar > 0
 
     :param crs: coordinate reference system
-        |br| * the default value is 'epsg:3035'
+        |br| * the default value is 'EPSG:3035'
     :type crs: string
 
     :param variableName: parameter for plotting installed capacity ('_capacityVariablesOptimum') or operation
@@ -1184,7 +1184,7 @@ def plotLocationalColorMap(
 
     if doSum:
         data = data.sum(axis=1)
-    gdf = gpd.read_file(locationsShapeFileName).to_crs({"init": crs})
+    gdf = gpd.read_file(locationsShapeFileName).to_crs(crs)
 
     # Make sure the data and gdf indices match
     ## 1. Sort the indices to obtain same order
```
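Two related API cleanups run through this file: `to_excel` now receives the sheet name via the explicit `sheet_name=` keyword rather than positionally, and the GeoPandas calls pass a plain `"EPSG:3035"` authority string to `to_crs` instead of the long-deprecated `{"init": crs}` dict. The uppercase `EPSG:` prefix matches the `authority:code` form pyproj expects. A tiny sketch of that string normalization (the `normalize_crs` helper is hypothetical, for illustration only, and not part of FINE or pyproj):

```python
def normalize_crs(crs: str) -> str:
    """Hypothetical helper: uppercase the authority prefix of an
    ``authority:code`` CRS string, e.g. 'epsg:3035' -> 'EPSG:3035'."""
    authority, sep, code = crs.partition(":")
    if not sep:
        return crs  # not an authority:code string; leave it unchanged
    return f"{authority.upper()}:{code}"

print(normalize_crs("epsg:3035"))  # EPSG:3035
```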
