Commit 146725f

Merge pull request #83 from CosmoStat/dummy_main

Pre-release v2.0.0

2 parents 899a6e3 + 0dbb7f9

419 files changed: 24590 additions, 27122 deletions

.github/workflows/cd.yml

Lines changed: 42 additions & 0 deletions (new file)

```yaml
# This workflow will install Python dependencies, run tests and lint with a single version of Python
# For more information see: https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-python

name: CD

on:
  push:
    branches:
      - main

jobs:
  docs:
    name: Deploy API documentation
    runs-on: [ubuntu-latest]

    steps:
      - name: Checkout
        uses: actions/checkout@v3

      - name: Set up Python 3.10.5
        uses: actions/setup-python@v3
        with:
          python-version: "3.10.5"

      - name: Check Python Version
        run: python --version

      - name: Install dependencies
        run: |
          python -m pip install ".[docs]"

      - name: Build API documentation
        run: |
          sphinx-apidoc -Mfeo docs/source src/wf_psf
          sphinx-build docs/source docs/build

      - name: Deploy API documentation
        uses: peaceiris/actions-gh-pages@v3  # action name recovered from email-obfuscated text; version tag assumed
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: docs/build
```

.github/workflows/ci.yml

Lines changed: 29 additions & 0 deletions (new file)

```yaml
# This workflow will install Python dependencies, run tests and lint with a single version of Python
# For more information see: https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-python

name: CI

on:
  pull_request:
    branches:
      - main

jobs:
  test-full:
    runs-on: [ubuntu-latest]

    steps:
      - name: Checkout
        uses: actions/checkout@v3

      - name: Set up Python 3.10.5
        uses: actions/setup-python@v3
        with:
          python-version: "3.10.5"

      - name: Install dependencies
        run: python -m pip install ".[test]"

      - name: Test with pytest
        run: python -m pytest
```
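The `test-full` job runs the suite with `python -m pytest`. As a minimal sketch of the convention pytest relies on (the module and function names below are illustrative, not taken from the wf-psf test suite), pytest collects any `test_*` function found in files named `test_*.py`:

```python
# test_smoke.py -- a hypothetical module that `python -m pytest` would collect
def add_vectors(a, b):
    """Element-wise sum of two equal-length sequences."""
    return [x + y for x, y in zip(a, b)]

def test_add_vectors():
    # pytest treats any test_* function as a test case and reports
    # a plain `assert` failure with an explanatory diff
    assert add_vectors([1, 2], [3, 4]) == [4, 6]

# Under pytest this runs automatically; calling it directly also works:
test_add_vectors()
result = add_vectors([1, 2], [3, 4])
```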

.gitignore

Lines changed: 12 additions & 0 deletions

```diff
@@ -5,6 +5,7 @@ __pycache__/
 
 # Remove method comparison data
 method-comparison/compatible-datasets/*
+tf_notebooks
 
 # Log files from slurm
 *.err
@@ -61,11 +62,13 @@ htmlcov/
 .coverage.*
 .cache
 nosetests.xml
+pytest.xml
 coverage.xml
 *.cover
 *.py,cover
 .hypothesis/
 .pytest_cache/
+src/wf_psf/pytest.xml
 
 # Translations
 *.mo
@@ -86,6 +89,11 @@ instance/
 
 # Sphinx documentation
 docs/_build/
+docs/source/wf_psf*.rst
+docs/source/_static/file.png
+docs/source/_static/images/logo_colab.png
+docs/source/_static/minus.png
+docs/source/_static/plus.png
 
 # PyBuilder
 target/
@@ -143,3 +151,7 @@ dmypy.json
 
 # Pyre type checker
 .pyre/
+
+
+# WF-PSF Debug
+/debug/
```
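The new `docs/source/wf_psf*.rst` entry uses shell-style globbing to ignore generated Sphinx API stubs. A rough way to preview what such a pattern matches is Python's `fnmatch` (only an approximation of git's matcher, which treats `/` and `**` specially; the candidate paths below are hypothetical):

```python
from fnmatch import fnmatch

# Candidate paths: a generated API stub should match, a hand-written page should not
generated = "docs/source/wf_psf.sims.rst"   # hypothetical sphinx-apidoc output
handwritten = "docs/source/index.rst"

matches = [p for p in (generated, handwritten)
           if fnmatch(p, "docs/source/wf_psf*.rst")]
print(matches)  # only the generated stub matches
```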

README.md

Lines changed: 5 additions & 97 deletions

````diff
@@ -3,103 +3,11 @@
 <h1 align='center'>WaveDiff</h1>
 <h2 align='center'>A differentiable data-driven wavefront-based PSF modelling framework.</h2>
 
+WaveDiff is a differentiable PSF modelling pipeline constructed with [Tensorflow](https://github.com/tensorflow/tensorflow). It was developed at the [CosmoStat lab](https://www.cosmostat.org) at CEA Paris-Saclay.
+
+See the [documentation](https://cosmostat.github.io/wf-psf/) for details on how to install and run WaveDiff.
+
 This repository includes:
 - A differentiable PSF model entirely built in [Tensorflow](https://github.com/tensorflow/tensorflow).
-- A numpy-based PSF simulator [here](https://github.com/tobias-liaudat/wf-psf/blob/main/wf_psf/SimPSFToolkit.py).
+- A [numpy-based PSF simulator](https://github.com/CosmoStat/wf-psf/tree/dummy_main/src/wf_psf/sims).
 - All the scripts, jobs and notebooks required to reproduce the results in [arXiv:2203.04908](http://arxiv.org/abs/2203.04908) and [arXiv:2111.12541](https://arxiv.org/abs/2111.12541).
-
-For more information on how to use the WaveDiff model through configurable scripts see the `long-runs` directory's [README](https://github.com/tobias-liaudat/wf-psf/blob/main/long-runs/README.md).
-
-## Proposed framework
-
-A schematic of the proposed framework can be seen below. The PSF model is estimated (trained) using star observations in the field-of-view.
-
-<img height=300 src="assets/PSF_model_diagram_v6.png" >
-
-<!-- Visual reconstruction example of the WaveDiff-original PSF model trained on a simplified Euclid-like setting.
-
-<img height=800 src="assets/PSF_reconstruction_example.png" > -->
-
-
-## Install
-
-`wf-psf` is pure python and can be easily installed with `pip`. After cloning the repository, run the following commands:
-
-```bash
-$ cd wf-psf
-$ pip install .
-```
-
-The package can then be imported in Python as `import wf_psf as wf`. We recommend using the release `1.2.0` for stability as the current main branch is under development.
-
-## Requirements
-- [numpy](https://github.com/numpy/numpy) [>=1.19.2]
-- [scipy](https://github.com/scipy/scipy) [>=1.5.2]
-- [TensorFlow](https://www.tensorflow.org/) [==2.4.1]
-- [TensorFlow Addons](https://github.com/tensorflow/addons) [==0.12.1]
-- [Astropy](https://github.com/astropy/astropy) [==4.2]
-- [zernike](https://github.com/jacopoantonello/zernike) [==0.0.31]
-- [opencv-python](https://github.com/opencv/opencv-python) [>=4.5.1.48]
-- [pillow](https://github.com/python-pillow/Pillow) [>=8.1.0]
-- [galsim](https://github.com/GalSim-developers/GalSim) [>=2.3.1]
-
-Optional packages:
-- [matplotlib](https://github.com/matplotlib/matplotlib) [=3.3.2]
-- [seaborn](https://github.com/mwaskom/seaborn) [>=0.11]
-
-
-## Reproducible research
-
-#### [arXiv:2203.04908](http://arxiv.org/abs/2203.04908) Rethinking data-driven point spread function modeling with a differentiable optical model (2022)
-_Submitted._
-
-- Use the release 1.2.0.
-- All the scripts, jobs and notebooks to reproduce the figures from the article can be found [here](https://github.com/tobias-liaudat/wf-psf/tree/main/papers/article_IOP).
-- The trained PSF models are found [here](https://github.com/tobias-liaudat/wf-psf/tree/main/papers/article_IOP/data/models).
-- The input PSF field can be found [here](https://github.com/tobias-liaudat/wf-psf/tree/main/data).
-- The script used to generate the input PSF field is [this one](https://github.com/tobias-liaudat/wf-psf/blob/main/long-runs/LR-PSF-field-gen-coherentFields.py).
-- The code required to run the comparison against pixel-based PSF models is in [this directory](https://github.com/tobias-liaudat/wf-psf/tree/main/method-comparison).
-- The training of the models was done using [this script](https://github.com/tobias-liaudat/wf-psf/blob/main/long-runs/train_eval_plot_script_click.py). In order to match the script's option for the different models with the article you should follow:
-    - `poly->WaveDiff-original`
-    - `graph->WaveDiff-graph`
-    - `mccd->WaveDiff-Polygraph`
-
-_Note: To run the comparison to other PSF models you need to install them first. See [RCA](https://github.com/CosmoStat/rca), [PSFEx](https://github.com/astromatic/psfex) and [MCCD](https://github.com/CosmoStat/mccd)._
-
-
-#### [arXiv:2111.12541](https://arxiv.org/abs/2111.12541) Rethinking the modeling of the instrumental response of telescopes with a differentiable optical model (2021)
-_NeurIPS 2021 Workshop on Machine Learning and the Physical Sciences._
-
-- Use the release 1.2.0.
-- All the scripts, jobs and notebooks to reproduce the figures from the article can be found [here](https://github.com/tobias-liaudat/wf-psf/tree/main/papers/Neurips2021_ML4Physics_workshop).
-
-
-## Citation
-
-If you use `wf-psf` in a scientific publication, we would appreciate citations to the following paper:
-
-*Rethinking data-driven point spread function modeling with a differentiable optical model*, T. Liaudat, J.-L. Starck, M. Kilbinger, P.-A. Frugier, [arXiv:2203.04908](http://arxiv.org/abs/2203.04908), 2022.
-
-The BibTeX citation is the following:
-```
-@misc{https://doi.org/10.48550/arxiv.2203.04908,
-  doi = {10.48550/ARXIV.2203.04908},
-  url = {https://arxiv.org/abs/2203.04908},
-  author = {Liaudat, Tobias and Starck, Jean-Luc and Kilbinger, Martin and Frugier, Pierre-Antoine},
-  keywords = {Instrumentation and Methods for Astrophysics (astro-ph.IM), Computer Vision and Pattern Recognition (cs.CV), FOS: Physical sciences, FOS: Physical sciences, FOS: Computer and information sciences, FOS: Computer and information sciences},
-  title = {Rethinking data-driven point spread function modeling with a differentiable optical model},
-  publisher = {arXiv},
-  year = {2022},
-  copyright = {arXiv.org perpetual, non-exclusive license}
-}
-```
````
config/configs.yaml

Lines changed: 2 additions & 0 deletions (new file)

```yaml
---
training_conf: training_config.yaml
```
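`configs.yaml` is a one-entry mapping that points the pipeline at the training configuration file. In real code it would be read with a YAML parser (e.g. PyYAML's `yaml.safe_load`); purely to illustrate the shape of the data without extra dependencies, a hand-rolled parse of this particular two-line file:

```python
# Illustrative only: this naive split works for the flat single-level
# mapping above, not for YAML in general.
text = """---
training_conf: training_config.yaml
"""

config = {}
for line in text.splitlines():
    if line.startswith("---") or not line.strip():
        continue  # skip the document marker and blank lines
    key, _, value = line.partition(":")
    config[key.strip()] = value.strip()

print(config["training_conf"])  # training_config.yaml
```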

config/data_config.yaml

Lines changed: 44 additions & 0 deletions (new file)

```yaml
# Training and test data sets for training and/or metrics evaluation
data:
  training:
    # Specify directory path to data; Default setting is /path/to/repo/data
    data_dir: data/coherent_euclid_dataset/
    file: train_Euclid_res_200_TrainStars_id_001.npy
    # if training data set file does not exist, generate a new one by setting values below
    stars: null
    positions: null
    SEDS: null
    zernike_coef: null
    C_poly: null
    params:
      d_max: 2
      max_order: 45
      x_lims: [0, 1000.0]
      y_lims: [0, 1000.0]
      grid_points: [4, 4]
      n_bins: 20
      max_wfe_rms: 0.1
      oversampling_rate: 3.0
      output_Q: 3.0
      output_dim: 32
      LP_filter_length: 2
      pupil_diameter: 256
      euclid_obsc: true
      n_stars: 200
  test:
    data_dir: data/coherent_euclid_dataset/
    file: test_Euclid_res_id_001.npy
    # If test data set file not provided produce a new one
    stars: null
    noisy_stars: null
    positions: null
    SEDS: null
    zernike_coef: null
    C_poly: null
    parameters:
      d_max: 2
      max_order: 45
      x_lims: [0, 1000.0]
      y_lims: [0, 1000.0]
      grid_points: [4, 4]
      max_wfe_rms: 0.1
```
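Once parsed, the nested layout above becomes nested mappings. A sketch of consuming it, with the parsed result written out as a plain Python dict (keys and values copied from the file; how wf_psf actually consumes the config is not shown here and the access pattern is an assumption):

```python
# A hand-copied subset of the parsed data_config.yaml
data_config = {
    "data": {
        "training": {
            "data_dir": "data/coherent_euclid_dataset/",
            "file": "train_Euclid_res_200_TrainStars_id_001.npy",
            "params": {
                "d_max": 2,
                "max_order": 45,
                "x_lims": [0, 1000.0],
                "y_lims": [0, 1000.0],
                "grid_points": [4, 4],
                "n_stars": 200,
            },
        },
    },
}

training = data_config["data"]["training"]
# grid_points gives the field sampling: a 4 x 4 grid of star positions
n_grid_positions = training["params"]["grid_points"][0] * training["params"]["grid_points"][1]
dataset_path = training["data_dir"] + training["file"]
print(n_grid_positions)  # 16
```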

config/logging.conf

Lines changed: 33 additions & 0 deletions (new file)

```ini
[loggers]
keys=root, simpleLogger

[handlers]
keys=consoleHandler, fileHandler

[formatters]
keys=simpleFormatter

[logger_root]
level=DEBUG
handlers=consoleHandler, fileHandler

[logger_simpleLogger]
level=DEBUG
handlers=consoleHandler, fileHandler
qualname=simpleLogger
propagate=0

[handler_consoleHandler]
class=StreamHandler
level=DEBUG
formatter=simpleFormatter
args=(sys.stdout,)

[handler_fileHandler]
class=FileHandler
level=DEBUG
formatter=simpleFormatter
args=('%(filename)s', 'w')

[formatter_simpleFormatter]
format=%(asctime)s - %(name)s - %(levelname)s - %(message)s
```
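The `%(filename)s` placeholder in the fileHandler's `args` is not filled in by logging itself: `logging.config.fileConfig` passes its `defaults` mapping to the underlying ConfigParser, which interpolates the placeholder at load time. A self-contained sketch of that mechanism (the config is a trimmed copy of the file above, and the log-file name is an arbitrary example):

```python
import logging
import logging.config
import os
import tempfile

# Trimmed copy of config/logging.conf: only the root logger and the
# file handler are kept, to focus on the %(filename)s placeholder.
CONF = """\
[loggers]
keys=root

[handlers]
keys=fileHandler

[formatters]
keys=simpleFormatter

[logger_root]
level=DEBUG
handlers=fileHandler

[handler_fileHandler]
class=FileHandler
level=DEBUG
formatter=simpleFormatter
args=('%(filename)s', 'w')

[formatter_simpleFormatter]
format=%(asctime)s - %(name)s - %(levelname)s - %(message)s
"""

with tempfile.TemporaryDirectory() as tmp:
    conf_path = os.path.join(tmp, "logging.conf")
    log_path = os.path.join(tmp, "run.log")  # arbitrary example name
    with open(conf_path, "w") as f:
        f.write(CONF)

    # `defaults` feeds the ConfigParser interpolation, so '%(filename)s'
    # in [handler_fileHandler] args becomes the path chosen here.
    logging.config.fileConfig(conf_path, defaults={"filename": log_path})
    logging.getLogger().debug("hello from fileConfig")
    logging.shutdown()  # flush and close the file handler

    with open(log_path) as f:
        contents = f.read()

print("hello from fileConfig" in contents)  # True
```

Note the formatter's `format` value is read raw by `fileConfig`, so its own `%(asctime)s`-style fields are left for the logging formatter and do not clash with the ConfigParser interpolation.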
