28 changes: 12 additions & 16 deletions README.md
@@ -7,23 +7,21 @@
[![Ruff](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/astral-sh/ruff/main/assets/badge/v2.json)](https://github.com/astral-sh/ruff)
[![Contributions welcome](https://img.shields.io/badge/contributions-welcome-brightgreen.svg?style=flat)](https://github.com/synthesizer-project/synference/blob/main/docs/CONTRIBUTING.md)
[![License: GPLv3](https://img.shields.io/badge/License-GPLv3-blue.svg)](https://www.gnu.org/licenses/gpl-3.0)

<!--[![Documentation Status](https://github.com/synthesizer-project/synference/actions/workflows/static.yml/badge.svg)](https://synthesizer-project.github.io/synference/)-->
[![Documentation Status](https://github.com/synthesizer-project/synference/actions/workflows/docs.yml/badge.svg)](https://synthesizer-project.github.io/synference/)

### Overview

Synference is a Python package designed to fit to perform simulation-based inference (SBI, also known as likelihood free inference) SED fitting. It integrates with [Synthesizer](https://synthesizer-project.github.io) for flexible and fast generation of mock spectra and photometry,
and uses the [LtU-ILI](https://ltu-ili.readthedocs.io/) package for fast, amortised posterior inference.
Synference is a Python package for performing simulation-based inference (SBI, also known as likelihood-free inference) SED fitting. It integrates with [Synthesizer](https://synthesizer-project.github.io) for flexible and fast generation of mock spectra and photometry, and uses the [LtU-ILI](https://ltu-ili.readthedocs.io/) package for fast, amortised posterior inference.

### Key Features

- **Flexible Mock Generation**: Generate mock spectra and photometry using Synthesizer, allowing for full flexibility in almost every aspect of SED generation, including a wide range of post-processed stellar population synthesis grids.
- **Flexible Training**: Mock photometry creation is seperated from the training of the inference model, allowing for flexible training strategies and the use of different inference models, as well as quickly switching between different feature sets - e.g. different filtersets, different noise models, etc.
- **Flexible Training**: Mock photometry creation is separated from the training of the inference model, allowing for flexible training strategies and the use of different inference models, as well as quickly switching between different feature sets - e.g. different filtersets, different noise models, etc.
- **Fast Inference**: Leverage LtU-ILI for fast, amortised posterior inference, enabling efficient fitting of complex models to data.

### Requirements

synference requires Python 3.10 or higher. It has the following dependencies:
Synference requires Python 3.10 or higher. It has the following dependencies:

- [synthesizer](https://synthesizer-project.github.io) for mock generation
- [ltu-ili](https://ltu-ili.readthedocs.io/) for inference
@@ -39,11 +37,11 @@ synference requires Python 3.10 or higher. It has the following dependencies:
- [tqdm](https://tqdm.github.io/) for progress bars
- [jax](https://jax.readthedocs.io/) for GPU acceleration (optional, for some inference models)

These dependencies will be automatically installed when you install synference using pip.
These dependencies will be automatically installed when you install Synference using pip.

### Installation

The easiest way to currently install synference is to clone the repository and install it in editable mode:
The easiest way to currently install Synference is to clone the repository and install it in editable mode:

```bash
git clone https://www.github.com/synthesizer-project/synference.git
@@ -53,7 +51,7 @@ pip install -e .

### Getting Started

To get started with synference, you can check out the [examples](examples/) directory for usage examples and tutorials. The examples cover various aspects of using synference, including:
To get started with Synference, you can check out the [examples](examples/) directory for usage examples and tutorials. The examples cover various aspects of using Synference, including:

- Generating mock spectra and photometry with Synthesizer
- Training inference models with LtU-ILI
@@ -64,7 +62,7 @@ To get started with synference, you can check out the [examples](examples/) dire

The most basic usage, for creating a simple mock catalogue and training a model on it looks like this:

Firstly we setup the Synthesizer based model. More details on how to set up the Synthesizer model can be found in the [Synthesizer documentation](https://synthesizer-project.github.io/). Here we use a BPASS SPS grid, a lognormal star formation history, a single stellar metallicity and a simple emission model including Cloudy nebular emission but no dust reprocessing. The photometric filters used are common JWST/NIRCam wideband filters, but any filters supported by [SVO](https://svo2.cab.inta-csic.es/theory/fps/index.php) or loaded manually can be used. The model parameters are drawn from a Latin hypercube sampling of the parameter space, but this can be done in any way indepedent of synference. All we are providing to the grid generation is a set of *10,000* galaxies with a range of stellar masses, redshifts, metallicities, and star formation histories, and these can be created in any way you like.
First, we set up the Synthesizer-based model. More details on how to set up the Synthesizer model can be found in the [Synthesizer documentation](https://synthesizer-project.github.io/). Here we use a BPASS SPS grid, a lognormal star formation history, a single stellar metallicity, and a simple emission model including Cloudy nebular emission but no dust reprocessing. The photometric filters used are common JWST/NIRCam wideband filters, but any filters supported by [SVO](https://svo2.cab.inta-csic.es/theory/fps/index.php) or loaded manually can be used. The model parameters are drawn from a Latin hypercube sampling of the parameter space, but this can be done in any way, independent of Synference. All we are providing to the grid generation is a set of *10,000* galaxies with a range of stellar masses, redshifts, metallicities, and star formation histories, and these can be created in any way you like.

```python
from synthesizer.grid import Grid
@@ -156,17 +154,15 @@ empirical_model_fitter.recover_SED(observed_data_vector)

```
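As a side note, the Latin hypercube draw described above can be produced with any tool you like before handing the parameters to library generation. Below is a minimal sketch using `scipy.stats.qmc`; the parameter names and ranges are purely illustrative and are not Synference defaults.

```python
# Illustrative Latin hypercube draw of 10,000 parameter sets using scipy.
# Parameter names and ranges are placeholders, not Synference defaults.
import numpy as np
from scipy.stats import qmc

n_galaxies = 10_000

# Lower/upper bounds for [log10(M*/Msun), redshift, log10(Z), SFH timescale]
lower = np.array([7.0, 0.0, -3.5, 0.1])
upper = np.array([11.5, 10.0, -1.5, 1.0])

sampler = qmc.LatinHypercube(d=len(lower), seed=42)
unit_samples = sampler.random(n=n_galaxies)     # shape (10000, 4), values in [0, 1)
params = qmc.scale(unit_samples, lower, upper)  # rescale to the physical ranges

log_mstar, redshift, log_zmet, sfh_tau = params.T
```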

This is just a basic example to get you started. synference is highly flexible and can be adapted to a wide range of use cases in simulation-based inference for SED fitting.
This is just a basic example to get you started. Synference is highly flexible and can be adapted to a wide range of use cases in simulation-based inference for SED fitting.

### Documentation

Work in progress.
Documentation for Synference is available at [synthesizer-project.github.io/synference](https://synthesizer-project.github.io/synference/). The documentation includes installation instructions, tutorials, API references, and examples to help you get started with using Synference for your own projects.

### Contributing

We welcome contributions to synference! If you have suggestions, bug reports, or would like to contribute code, please open an issue or submit a pull request on GitHub. Please see our [Code of Conduct](CODE_OF_CONDUCT.md) for more details on how to contribute and interact with the community.
We welcome contributions to Synference! If you have suggestions, bug reports, or would like to contribute code, please open an issue or submit a pull request on GitHub. Please see our [Code of Conduct](CODE_OF_CONDUCT.md) for more details on how to contribute and interact with the community.

### License
This project is licensed under the GNU General Public License v3.0 (GPLv3). See the [LICENSE](LICENSE) file for details. This means you can use, modify, and distribute the code freely, but any derivative works must also be open source and distributed under the same license. We warn users that this software is offered "as is", without any warranty or guarantee of fitness for a particular purpose. synference is under active development, and therefore may change in the future.


This project is licensed under the GNU General Public License v3.0 (GPLv3). See the [LICENSE](LICENSE) file for details. This means you can use, modify, and distribute the code freely, but any derivative works must also be open source and distributed under the same license. We warn users that this software is offered "as is", without any warranty or guarantee of fitness for a particular purpose. Synference is under active development, and therefore may change in the future.
2 changes: 1 addition & 1 deletion docs/CONTRIBUTING.md
@@ -94,7 +94,7 @@ This will build a local copy of the documentation representative of the currentl

## Contributing to the Documentation

The synference documentation is written in a combination of reStructuredText, Jupyter notebooks and Python scripts.
The Synference documentation is written in a combination of reStructuredText, Jupyter notebooks and Python scripts.
Adding content should be relatively simple if you follow the instructions below.

### Adding notebooks
25 changes: 4 additions & 21 deletions docs/source/advanced_topics/simformer.ipynb
@@ -77,16 +77,8 @@
"\n",
"Like all other synference models we can also set the ```training_batch_size```, ```validation_fraction```, ```stop_after_epochs``` and ```clip_max_norm``` parameters.\n",
"\n",
"```python"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "068c4c59",
"metadata": {},
"outputs": [],
"source": [
"```python\n",
"\n",
"fitter.run_single_simformer(\n",
" name_append=\"simformer_test\",\n",
" sde_type=\"ve\",\n",
@@ -106,17 +98,8 @@
" validation_fraction=0.1,\n",
" stop_after_epochs=30,\n",
" plot=False, # Currently the LtU-ILI plotting doesn't work with the simformer\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "e527bead",
"metadata": {},
"outputs": [],
"source": [
"?fitter.posteriors"
")\n",
"```"
]
}
],
2 changes: 1 addition & 1 deletion docs/source/conf.py
@@ -80,7 +80,7 @@
"sidebar_hide_name": True,
}

html_title = "synference"
html_title = "Synference"

ignore_pattern = ["basic_model.py"]

6 changes: 3 additions & 3 deletions docs/source/index.rst
@@ -6,7 +6,7 @@ Synference is an open-source python package for SED fitting of photometric and s
This documentation provides a broad overview of the various components in Synference and how they interact.
The `Getting Started guide <getting_started/getting_started>`_ contains download and installation instructions, as well as an overview of the code.

For detailed examples of what synference can do, take a look at the `Examples <auto_examples/index>`_ page.
For detailed examples of what Synference can do, take a look at the `Examples <auto_examples/index>`_ page.
A full description of the code base is provided in the `API <API>`_.

Contents
@@ -29,7 +29,7 @@ Contents
Citation & Acknowledgement
--------------------------

Please cite **all** of the following papers if you use synference in your research. Harvey et al. 2025 (in prep.) introduces the synference code, `Lovell et al. 2025 <https://ui.adsabs.harvard.edu/abs/2025arXiv250803888L/abstract>`_ & `Roper et al. 2025 <https://ui.adsabs.harvard.edu/abs/2025arXiv250615811R/abstract>`_ introduce the synthesizer code, and `Ho et al. 2024 <https://ui.adsabs.harvard.edu/abs/2024OJAp....7E..54H/abstract>`_ introduces the LtU-ILI framework.
Please cite **all** of the following papers if you use Synference in your research. Harvey et al. 2025 (in prep.) introduces the Synference code, `Lovell et al. 2025 <https://ui.adsabs.harvard.edu/abs/2025arXiv250803888L/abstract>`_ & `Roper et al. 2025 <https://ui.adsabs.harvard.edu/abs/2025arXiv250615811R/abstract>`_ introduce the Synthesizer code, and `Ho et al. 2024 <https://ui.adsabs.harvard.edu/abs/2024OJAp....7E..54H/abstract>`_ introduces the LtU-ILI framework.

.. code-block:: bibtex

@@ -97,4 +97,4 @@ Primary Contributors
License
-------

synference is free software made available under the GNU General Public License v3.0. For details see the `LICENSE <https://github.com/synthesizer-project/synference/blob/main/LICENSE.md>`_.
Synference is free software made available under the GNU General Public License v3.0. For details see the `LICENSE <https://github.com/synthesizer-project/synference/blob/main/LICENSE.md>`_.
10 changes: 5 additions & 5 deletions docs/source/library_gen/basic_library_generation.ipynb
@@ -428,16 +428,16 @@
"1. Created the individual galaxies, setting the attributes as we described.\n",
"2. Instantiated a `Pipeline` and passed in the array of galaxies.\n",
"3. Ran the pipeline for the galaxies to generate the observed spectroscopy and photometry, which is stored in a HDF5 file.\n",
"4. Loaded the saved HDF5 file, extracted the photometry for our emission key 'emergent', and generated the photometry and input parameter grids.\n",
"5. Saved these separate grids into a different HDF5 file, which will be used by the code later."
"4. Loaded the saved HDF5 file, extracted the photometry for our emission key 'emergent', and generated the photometry and input parameter libraries.\n",
"5. Saved these separate libraries into a different HDF5 file, which will be used by the code later."
]
},
{
"cell_type": "markdown",
"id": "58d8c5f5",
"metadata": {},
"source": [
"## Inspecting our Grid\n",
"## Inspecting our Library\n",
"\n",
"There are numerous packages for inspecting HDF5 files, including the `h5py` package, which you have installed if you have run the code to this point without crashing. For more visual views, we recommend [H5Web](https://h5web.panosc.eu/), which has a VS Code extension, or the command line interface `h5forest`, which you can find on Github [here](https://github.com/WillJRoper/h5forest).\n",
"\n",
@@ -453,9 +453,9 @@
"source": [
"import h5py\n",
"\n",
"from synference import grid_folder\n",
"from synference import library_folder\n",
"\n",
"with h5py.File(f\"{grid_folder}/test_model_library.hdf5\") as f:\n",
"with h5py.File(f\"{library_folder}/test_model_library.hdf5\") as f:\n",
" for dataset in f:\n",
" print(f\"- {dataset}\")\n",
" for array in f[dataset]:\n",
2 changes: 1 addition & 1 deletion docs/source/library_gen/complex_library_generation.ipynb
@@ -22,10 +22,10 @@
"source": [
"import numpy as np\n",
"from astropy.cosmology import Planck18 as cosmo\n",
"from synthesizer.emission_models import Greybody # noqa\n",

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

Verify the simplified import path for Greybody.

The import has been updated to use the top-level package path. Ensure that Greybody is exported from synthesizer.emission_models and this import path is valid.

Several verification scripts were run against the repository (searching for the `Greybody` class definition and its import sites, inspecting the `emission_models/__init__.py` exports, checking `pyproject.toml`, and reading the notebook source); none of them confirmed a top-level `Greybody` export. Representative commands:

```shell
#!/bin/bash
# Search for the Greybody class definition and its import sites
rg -n "class Greybody" --type=py -C2
rg -n "from.*Greybody" --type=py -C2

# Inspect the emission_models package exports
find . -path "*/emission_models/__init__.py" -type f | head -5
rg -n "__all__" --type=py --glob="*/emission_models/__init__.py" -C5

# Confirm the import statement used in the notebook
cat docs/source/library_gen/complex_library_generation.ipynb | head -100
```

🌐 Web query:

synthesizer.emission_models Greybody import path documentation

💡 Result:

Use the Greybody class from the dust emission module. Example import:

from synthesizer.emission_models.dust.emission import Greybody. [1][2]

References:

  • Dust emission docs (shows Greybody usage and import). [1]
  • Generating photometry example (uses Greybody import). [2]

Fix the incorrect import path for Greybody.

The simplified import path from synthesizer.emission_models import Greybody is invalid. The correct import is from synthesizer.emission_models.dust.emission import Greybody, as confirmed by all example scripts in the codebase and official documentation. The notebook's current import will fail at runtime.

Update line 25 to use the full module path matching the examples:

from synthesizer.emission_models.dust.emission import Greybody  # noqa
🤖 Prompt for AI Agents
In docs/source/library_gen/complex_library_generation.ipynb at line 25, the
import statement for Greybody uses an incorrect and incomplete module path that
will cause a runtime import error. Replace the current simplified import path
with the full and correct module path that matches the actual location of the
Greybody class in the codebase. The correct path should navigate through the
dust subpackage to reach the emission module where Greybody is actually defined.

"from synthesizer.emission_models.attenuation import (\n",
" Calzetti2000,\n",
") # noqa\n",
"from synthesizer.emission_models.dust.emission import Greybody, IR_templates # noqa\n",
"from synthesizer.emission_models.stellar.pacman_model import (\n",
" PacmanEmission,\n",
") # noqa\n",
4 changes: 2 additions & 2 deletions docs/source/library_gen/library_generation.rst
@@ -3,9 +3,9 @@ Library Generation

The principle of simulation-based inference relies on a 'simulator', which takes in a set of parameters and produces a synthetic observation. In traditional SED fitting, this is often a forward model that generates a synthetic SED from a set of physical parameters, and typically computed on-the-fly during inference. Whilst SBI models can be trained using such a simulator (called 'online' or 'active' learning), this is computationally wasteful if you intend to train multiple SBI models with the same simulator configuration.

synference therefore provides tools to generate a 'library' of pre-computed simulations, which can then be used to train multiple SBI models. This is particularly useful for computationally expensive simulators, where generating a large library of simulations can take hours or days, but training an SBI model on that library takes only minutes to hours.
Synference therefore provides tools to generate a 'library' of pre-computed simulations, which can then be used to train multiple SBI models. This is particularly useful for computationally expensive simulators, where generating a large library of simulations can take hours or days, but training an SBI model on that library takes only minutes to hours.

Normal use of synference uses the 'Synthesizer' package to generate synthetic observables from a set of input parameters. The Synthesizer package provides a high-level interface to a variety of underlying astrophysical simulation codes, and can generate a wide range of synthetic observables including photometric fluxes, spectra, and images. The full Synthesizer documentation can be found `here <https://synthesizer-project.github.io/synthesizer/>`_, but we provide a brief crash course below.
Normal use of Synference uses the 'Synthesizer' package to generate synthetic observables from a set of input parameters. The Synthesizer package provides a high-level interface to a variety of underlying astrophysical simulation codes, and can generate a wide range of synthetic observables including photometric fluxes, spectra, and images. The full Synthesizer documentation can be found `here <https://synthesizer-project.github.io/synthesizer/>`_, but we provide a brief crash course below.

.. toctree::
:maxdepth: 2
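As a rough illustration of the library idea described in this page (and not the Synference pipeline itself), precomputing a library amounts to evaluating a simulator over many parameter sets once and storing the parameter/observable pairs for later training. A toy sketch with a made-up simulator and output file name:

```python
# Toy sketch of a precomputed simulation library -- not the Synference pipeline.
# A simulator maps parameters to a synthetic observable; we evaluate it once
# over many parameter sets and store the (parameters, observable) pairs.
import numpy as np
import h5py

rng = np.random.default_rng(0)

def toy_simulator(theta):
    """Stand-in simulator: two parameters -> fake 5-band photometry."""
    norm, slope = theta
    bands = np.linspace(1.0, 5.0, 5)  # pretend pivot wavelengths
    return norm * bands ** slope

thetas = rng.uniform(low=[0.5, -2.0], high=[2.0, 0.0], size=(1000, 2))
observables = np.array([toy_simulator(t) for t in thetas])

with h5py.File("toy_library.hdf5", "w") as f:  # hypothetical output file
    f.create_dataset("parameters", data=thetas)
    f.create_dataset("photometry", data=observables)
```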
19 changes: 7 additions & 12 deletions docs/source/library_gen/synthesizer_crash_course.ipynb
@@ -40,19 +40,14 @@
"source": [
"Grids are handled using the `synthesizer.Grid` class, where the `grid_dir` argument tells the code where the desired Grid lives.\n",
"\n",
"Here is some code to download the grids using the `synthesizer-download` command."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "3bd60ad0",
"metadata": {},
"outputs": [],
"source": [
"Here is some code to download the grids using the `synthesizer-download` command.\n",
"\n",
"```python\n",
"\n",
"import subprocess\n",
"\n",
"subprocess.Popen([\"synthesizer-download\", \"--test-grids\", \"--dust-grid\"])"
"subprocess.Popen([\"synthesizer-download\", \"--test-grids\", \"--dust-grid\"])`\n",
"```"
Comment on lines +43 to +50

⚠️ Potential issue | 🟡 Minor

Remove stray backtick on line 49.

Line 49 has a trailing backtick that should be removed. The line should end with just a closing parenthesis.

Apply this diff:

-subprocess.Popen(["synthesizer-download", "--test-grids", "--dust-grid"])`
+subprocess.Popen(["synthesizer-download", "--test-grids", "--dust-grid"])
🤖 Prompt for AI Agents
In docs/source/library_gen/synthesizer_crash_course.ipynb around lines 43 to 50,
remove the stray trailing backtick at the end of line 49 so the subprocess.Popen
call ends with a closing parenthesis only; update the line to drop the final
backtick character and ensure the surrounding triple-backtick fenced code block
remains correct.

]
},
{
@@ -321,8 +316,8 @@
"metadata": {},
"outputs": [],
"source": [
"from synthesizer.emission_models import Blackbody\n",

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

Verify the simplified import path for Blackbody.

The import has been updated to use the top-level package path, consistent with similar changes in other notebooks. Ensure that Blackbody is exported from synthesizer.emission_models.

Several verification scripts were run against the repository (searching for the `Blackbody` class definition and its import sites, inspecting the `emission_models` package exports, extracting the relevant notebook cell, and checking `pyproject.toml`); none of them confirmed a top-level `Blackbody` export. Representative commands:

```shell
#!/bin/bash
# Search for the Blackbody class definition and its import sites
rg -n "class Blackbody" --type=py -C2
rg -n "from.*Blackbody" --type=py -C2

# Inspect the emission_models package exports
cat synthesizer/emission_models/__init__.py 2>/dev/null || echo "File not found"

# Find Blackbody references in the notebooks and check package metadata
rg -n "Blackbody" --type=json | head -20
find . -name "synthesizer_crash_course.ipynb" -type f
cat pyproject.toml | head -50
```

🌐 Web query:

synthesizer python package Blackbody emission_models import

💡 Result:

Use the dust emission module. Example:

from synthesizer.emission_models.dust.emission import Blackbody

then create one, e.g.:

from unyt import K
bb = Blackbody(temperature=100 * K)

Sources: Synthesizer docs (dust emission examples). [1][2]


Correct the import path to match the official Synthesizer package structure.

The simplified import from synthesizer.emission_models import Blackbody is incorrect. The official Synthesizer documentation specifies: from synthesizer.emission_models.dust.emission import Blackbody, which aligns with the import patterns used in the repository's example scripts. Update the notebook cell 24 to use the correct nested import path.

🤖 Prompt for AI Agents
In docs/source/library_gen/synthesizer_crash_course.ipynb around line 319, the
import statement uses an incorrect path; replace the simplified import with the
official nested path: update the notebook cell (cell 24) to import Blackbody
from synthesizer.emission_models.dust.emission so it matches the package
structure and repository examples.

"from synthesizer.emission_models.attenuation import PowerLaw\n",
"from synthesizer.emission_models.dust.emission import Blackbody\n",
"\n",
"dust_curve = PowerLaw(slope=-0.7)\n",
"dust_emission_model = Blackbody(temperature=30 * Kelvin)"
2 changes: 1 addition & 1 deletion docs/source/noise_modelling/creating_noise_model.rst
@@ -1,7 +1,7 @@
Uncertainty Modelling
*********************

Here we detail the various ways to create noise models in Synference.
Here we detail the various ways to create noise models in Synference. Noise models are used to characterise the observational uncertainties on synthetic observables, and are an important component of simulation-based inference when applying it to real data.

.. toctree::
:maxdepth: 2
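To make the role of a noise model concrete, here is a minimal sketch of one common choice: Gaussian scatter set by a fixed 5-sigma depth per band. All numbers are made up for illustration; the noise models Synference actually provides are documented in the pages listed in the toctree above.

```python
# Minimal sketch of a simple noise model: Gaussian scatter set by a fixed
# 5-sigma depth in each band. All numbers here are made up for illustration.
import numpy as np

rng = np.random.default_rng(1)

fluxes = np.array([120.0, 95.0, 80.0, 60.0])        # mock fluxes (nJy)
depths_5sigma = np.array([10.0, 12.0, 15.0, 20.0])  # per-band 5-sigma depths (nJy)

sigma = depths_5sigma / 5.0                    # 1-sigma uncertainty per band
noisy_fluxes = fluxes + rng.normal(0.0, sigma)  # perturb the mock photometry

print(noisy_fluxes)
print(sigma)
```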
4 changes: 4 additions & 0 deletions docs/source/posterior_inference/intro.rst
@@ -1,6 +1,10 @@
Posterior Inference
*******************

In this section, we provide an overview of the various methods for performing posterior inference in Synference. Posterior inference is a crucial step in the simulation-based inference workflow, once an SBI model has been trained. It involves using the trained model to infer the posterior distribution of model parameters given observed data.

We can also use the trained SBI model alongside our simulator to recover the predicted observables for our posterior samples, allowing us to assess the quality of our inference, and assess how well our model is able to reproduce the observed data.

.. toctree::
:maxdepth: 2

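A minimal sketch of the recovery step described above (a posterior predictive check), using placeholder objects: `posterior_samples` stands in for draws from a trained Synference/LtU-ILI posterior, and `simulator` for whatever forward model produced the training library.

```python
# Sketch of a posterior predictive check: push posterior draws back through the
# simulator and compare the recovered observables with the data.
# `simulator`, `posterior_samples`, and `observed` are all placeholders.
import numpy as np

def simulator(theta):
    norm, slope = theta
    return norm * np.linspace(1.0, 5.0, 5) ** slope

observed = np.array([1.1, 2.3, 3.8, 5.2, 6.9])  # made-up data vector

rng = np.random.default_rng(2)
posterior_samples = rng.normal(loc=[1.0, 1.0], scale=0.1, size=(500, 2))

predicted = np.array([simulator(t) for t in posterior_samples])
lo, med, hi = np.percentile(predicted, [16, 50, 84], axis=0)

# Fraction of bands where the data sit inside the 16-84 percentile band of the
# recovered observables; values near zero suggest a poor fit.
print(np.mean((observed >= lo) & (observed <= hi)))
```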
2 changes: 1 addition & 1 deletion docs/source/posterior_inference/sed_recovery.ipynb
@@ -115,7 +115,7 @@
"import numpy as np\n",
"from IPython.display import display\n",
"\n",
"path = \"../example_models/BPASS_DB_v4/sbi_test_data_BPASS_DenseBasis_v4_final.npz\"\n",
"path = f\"{test_data_dir}/sbi_test_data_BPASS_DenseBasis_v4_final.npz\"\n",
"data = np.load(path)\n",
"X_test = data[\"X\"]\n",
"y_test = data[\"y\"]"
2 changes: 2 additions & 0 deletions docs/source/sbi/introduction_to_sbi.rst
@@ -1,6 +1,8 @@
Introduction to SBI
*********************

In this section, we provide an introduction to the background and theory of Simulation-Based Inference (SBI) methods, which form the core of the Synference package. We discuss the advantages of SBI over traditional inference methods, and outline the general workflow for performing SBI.

.. toctree::
:maxdepth: 2

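For readers who want the one-line version before diving into the pages above: one common SBI recipe (neural posterior estimation, one realisation of the amortised inference referred to elsewhere in these docs) trains a conditional density estimator on simulated parameter-data pairs so that it approximates the posterior directly.

```latex
% Neural posterior estimation in one line: fit q_phi on simulated pairs
% (theta, x) drawn from the prior and simulator, then evaluate at the data.
\max_{\phi}\;
\mathbb{E}_{\theta \sim p(\theta),\; x \sim p(x \mid \theta)}
\bigl[\log q_{\phi}(\theta \mid x)\bigr],
\qquad
q_{\phi}(\theta \mid x_{\mathrm{obs}}) \approx p(\theta \mid x_{\mathrm{obs}}).
```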