-
Notifications
You must be signed in to change notification settings - Fork 2
Docs improvements #27
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
Conversation
WalkthroughDocumentation standardization across the project, including capitalization corrections of "Synference," typo fixes, and terminology updates from "grids" to "libraries." Import paths updated to use top-level package imports. Logging refactored in source code, and test fixture adjusted to use dynamic path resolution. Changes
Estimated code review effort🎯 2 (Simple) | ⏱️ ~12 minutes
Poem
Pre-merge checks and finishing touches❌ Failed checks (1 inconclusive)
✅ Passed checks (2 passed)
✨ Finishing touches
🧪 Generate unit tests (beta)
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out. Comment |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Actionable comments posted: 4
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
docs/source/sbi_train/intro_sbi.rst (1)
9-15: All seven toctree-referenced documentation files are missing.The verification confirms that
docs/source/sbi_train/contains onlyintro_sbi.rst. None of the following files exist in the repository:
feature_array.rstbasic_sbi_model.rstcomplex_sbi_model.rstmodel_validation.rstmodel_optimization.rstonline_training.rstvalidation_sampling.rstThe toctree in
intro_sbi.rst(lines 9-15) references these non-existent files, which will break documentation navigation and likely cause Sphinx build failures. Either create these documentation files with appropriate content or remove the references from the toctree.
🧹 Nitpick comments (5)
src/synference/library.py (5)
5211-5213: Tweak log verbosity for filter updatesLogging the full filter code list at info level can be noisy. Prefer debug (or summarize count).
- logger.info(f"Updated filters: {self.instrument.filters.filter_codes}") + logger.debug(f"Updated filters: {len(self.instrument.filters.filter_codes)} filters")
5311-5317: Include exception context when cosmology load failsCapture and emit the original exception to aid debugging.
- except Exception: + except Exception as e: from astropy.cosmology import Planck18 as cosmo - logger.warning("Failed to load cosmology from HDF5. Using Planck18 instead.") + logger.warning( + "Failed to load cosmology from HDF5. Using Planck18 instead.", + exc_info=e, + )
5350-5354: Defensive import for dust emission generatorsThis submodule path may differ across synthesizer versions. Guard the import and produce a clear message if unavailable.
- import synthesizer.emission_models.generators.dust as dem + try: + import synthesizer.emission_models.generators.dust as dem + except ImportError as e: + logger.error( + "Dust emission generators module not found in synthesizer. " + "Ensure your synthesizer version provides " + "emission_models.generators.dust.", + exc_info=e, + ) + raise
5433-5435: Verify dropped dust-emission params aren’t requiredYou pop 'cmb_factor' and 'temperature_z' then ignore them (commented mapping). If downstream models expect 'cmb_heating' (or equivalent), this silently changes behavior.
- cmb = dust_emission_model_params.pop("cmb_factor", None) - dust_emission_model_params.pop("temperature_z", None) - # if cmb is not None: - # dust_emission_model_params["cmb_heating"] = cmb != 1 + cmb = dust_emission_model_params.pop("cmb_factor", None) + dust_emission_model_params.pop("temperature_z", None) + if cmb is not None and "cmb_heating" in inspect.signature(dust_emission_model).parameters: + dust_emission_model_params["cmb_heating"] = (cmb != 1)If not applicable, add a brief comment explaining why these params are intentionally dropped to avoid future regressions.
6198-6202: Lower verbosity of large-shape diagnosticsThese are useful but belong at debug to keep logs clean in normal runs.
- logger.info(f"Number of parameters: {nparams}") - logger.info(f"Number of observations: {nobs}") - logger.info(f"Num rows in parameter library: {self.parameter_grid.shape[1]}") - logger.info(f"Num rows in observation library: {self.observation_grid.shape[1]}") + logger.debug(f"Number of parameters: {nparams}") + logger.debug(f"Number of observations: {nobs}") + logger.debug(f"Num rows in parameter library: {self.parameter_grid.shape[1]}") + logger.debug(f"Num rows in observation library: {self.observation_grid.shape[1]}")
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (19)
README.md(5 hunks)docs/CONTRIBUTING.md(1 hunks)docs/source/advanced_topics/simformer.ipynb(2 hunks)docs/source/conf.py(1 hunks)docs/source/index.rst(3 hunks)docs/source/library_gen/basic_library_generation.ipynb(2 hunks)docs/source/library_gen/complex_library_generation.ipynb(1 hunks)docs/source/library_gen/library_generation.rst(1 hunks)docs/source/library_gen/synthesizer_crash_course.ipynb(2 hunks)docs/source/noise_modelling/creating_noise_model.rst(1 hunks)docs/source/posterior_inference/intro.rst(1 hunks)docs/source/posterior_inference/sed_recovery.ipynb(1 hunks)docs/source/sbi/introduction_to_sbi.rst(1 hunks)docs/source/sbi_train/basic_sbi_model.ipynb(3 hunks)docs/source/sbi_train/intro_sbi.rst(1 hunks)docs/source/sbi_train/model_validation.ipynb(1 hunks)src/synference/library.py(7 hunks)src/synference/sbi_runner.py(0 hunks)tests/conftest.py(1 hunks)
💤 Files with no reviewable changes (1)
- src/synference/sbi_runner.py
🧰 Additional context used
🪛 LanguageTool
README.md
[grammar] ~65-~65: The word “setup” is a noun. The verb is spelled with a space.
Context: ...el on it looks like this:
Firstly we setup the Synthesizer based model. More detai...
(NOUN_VERB_CONFUSION)
[uncategorized] ~65-~65: This expression is usually spelled with a hyphen.
Context: ...ooks like this:
Firstly we setup the Synthesizer based model. More details on how to set up th...
(BASED_HYPHEN)
[grammar] ~65-~65: Make sure that a verb is not missing between ‘no’ and ‘dust’. (Ellipsis of a verb is possible in a sentence that uses coordination.)
Context: ...ncluding Cloudy nebular emission but no dust reprocessing. The photometric filters used are commo...
(DT_NN_VBG)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (4)
- GitHub Check: build (3.11)
- GitHub Check: build (3.13)
- GitHub Check: build (3.10)
- GitHub Check: build (3.12)
🔇 Additional comments (20)
docs/source/advanced_topics/simformer.ipynb (1)
80-82: LGTM! Good documentation formatting improvement.The markdown code block syntax is correctly applied, which will properly display the Python example with syntax highlighting. The consolidation of the code example into a single, properly formatted block improves readability.
Also applies to: 101-102
src/synference/library.py (2)
5374-5376: Good, explicit error when emission model missingClear message, early failure is appropriate here.
5415-5416: Good, explicit error for missing dust emission modelConcise and actionable; OK as-is.
docs/source/conf.py (1)
83-83: LGTM!The capitalization of "Synference" in the HTML title correctly standardizes the project name across documentation.
docs/CONTRIBUTING.md (1)
97-98: LGTM!The capitalization standardization and added guidance sentence improve the contributor experience and align with the broader documentation consistency effort.
docs/source/sbi/introduction_to_sbi.rst (1)
4-5: LGTM!The introductory paragraph provides excellent context for readers new to Simulation-Based Inference and clearly outlines what this section covers.
docs/source/noise_modelling/creating_noise_model.rst (1)
4-4: LGTM!The expanded introduction clearly explains the purpose and importance of noise models in the simulation-based inference workflow.
docs/source/posterior_inference/intro.rst (1)
4-7: LGTM!The introductory paragraphs effectively explain posterior inference's role in the SBI workflow and how to use trained models for data assessment.
docs/source/posterior_inference/sed_recovery.ipynb (1)
118-118: LGTM!The dynamic path resolution using
test_data_diris a solid improvement over hardcoded paths, enhancing portability across different environments.docs/source/index.rst (2)
9-9: LGTM!The capitalization of "Synference" maintains naming consistency throughout the documentation.
32-32: LGTM!The proper capitalization of "Synference" and "Synthesizer" in the citation text standardizes project naming and improves professional presentation.
docs/source/sbi_train/intro_sbi.rst (1)
4-5: LGTM!The introductory paragraph clearly describes the training workflow and learning approaches (pre-computed simulations vs. online learning).
docs/source/sbi_train/basic_sbi_model.ipynb (2)
124-124: LGTM! Documentation and code are aligned.The narrative now accurately describes that only the first 10 entries will be printed, and the code correctly slices the array.
Also applies to: 134-134
152-152: LGTM! Consistent array slicing for output.The observation grid is correctly sliced to show only the first 10 entries along the second axis, consistent with the parameter array treatment.
docs/source/sbi_train/model_validation.ipynb (2)
20-36: LGTM! Good documentation structure.The conversion to markdown with an embedded code example effectively documents the SNR-based validation splitting feature.
38-60: LGTM! Comprehensive validation function reference.The enumerated list of 13 validation functions provides a helpful overview for users. The descriptions are concise and informative.
README.md (1)
1-168: LGTM! Consistent capitalization and typo fixes.The README now consistently uses "Synference" as the proper noun throughout, and typos like "separated" and "independent" have been corrected. These changes improve the professional presentation of the documentation.
docs/source/library_gen/library_generation.rst (1)
6-8: LGTM! Consistent capitalization.The project name is now properly capitalized as "Synference" throughout the documentation.
docs/source/library_gen/basic_library_generation.ipynb (2)
431-432: LGTM! Improved terminology.The change from "grids" to "libraries" better reflects the conceptual model of collections of pre-computed simulations.
456-458:library_folderis properly exported from the synference package.The verification confirms that
library_folderis defined insrc/synference/library.py, imported insrc/synference/__init__.py, and included in the__all__list for public export. The notebook import is correct.
| "source": [ | ||
| "import numpy as np\n", | ||
| "from astropy.cosmology import Planck18 as cosmo\n", | ||
| "from synthesizer.emission_models import Greybody # noqa\n", |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
🧩 Analysis chain
Verify the simplified import path for Greybody.
The import has been updated to use the top-level package path. Ensure that Greybody is exported from synthesizer.emission_models and this import path is valid.
🏁 Script executed:
#!/bin/bash
# Check if Greybody is available at the top-level emission_models module
rg -n "class Greybody" --type=py -C2
rg -n "from.*Greybody" --type=py -C2Length of output: 3861
🏁 Script executed:
#!/bin/bash
# Check the emission_models __init__.py to see if Greybody is re-exported
find . -path "*/emission_models/__init__.py" -type f | head -5
rg -n "Greybody" --type=py $(find . -path "*/emission_models/__init__.py" -type f | head -1) -C3Length of output: 10609
🏁 Script executed:
#!/bin/bash
# Also check if there's an __all__ export list in emission_models
rg -n "__all__" --type=py --glob="*/emission_models/__init__.py" -C5Length of output: 148
🏁 Script executed:
#!/bin/bash
# Check the notebook content directly to confirm the import statement
cat docs/source/library_gen/complex_library_generation.ipynb | head -100Length of output: 3219
---
🏁 Script executed:
```shell
#!/bin/bash
# Find the actual emission_models __init__.py file in synthesizer package
fd "emission_models" --type d | head -10
Length of output: 0
🏁 Script executed:
#!/bin/bash
# Look for the emission_models module directory and inspect __init__.py
find . -type d -name "emission_models" | grep -E "synthesizer" | head -3Length of output: 0
🏁 Script executed:
#!/bin/bash
# Check if Greybody is exported at top-level by examining all import statements in emission_models __init__
find . -type d -name "emission_models" | grep synthesizer | head -1 | xargs -I {} find {} -name "__init__.py" | xargs catLength of output: 0
🏁 Script executed:
#!/bin/bash
# Alternative: search for where Greybody is defined
rg "^class Greybody" --type=py -C2Length of output: 0
🏁 Script executed:
#!/bin/bash
# Check for package metadata to understand dependencies
ls -la | grep -E "setup|pyproject|requirements"
cat pyproject.toml 2>/dev/null | grep -A 20 "dependencies\|synthesizer" || echo "No pyproject.toml found"
cat setup.py 2>/dev/null | grep -i synthesizer || echo "No setup.py found"Length of output: 1341
🏁 Script executed:
#!/bin/bash
# Search for any __init__.py files in the repository that might give clues
find . -name "__init__.py" -type f | grep -i emission | head -5Length of output: 0
🌐 Web query:
synthesizer.emission_models Greybody import path documentation
💡 Result:
Use the Greybody class from the dust emission module. Example import:
from synthesizer.emission_models.dust.emission import Greybody. [1][2]
References:
- Dust emission docs (shows Greybody usage and import). [1]
- Generating photometry example (uses Greybody import). [2]
Fix the incorrect import path for Greybody.
The simplified import path from synthesizer.emission_models import Greybody is invalid. The correct import is from synthesizer.emission_models.dust.emission import Greybody, as confirmed by all example scripts in the codebase and official documentation. The notebook's current import will fail at runtime.
Update line 25 to use the full module path matching the examples:
from synthesizer.emission_models.dust.emission import Greybody # noqa🤖 Prompt for AI Agents
In docs/source/library_gen/complex_library_generation.ipynb at line 25, the
import statement for Greybody uses an incorrect and incomplete module path that
will cause a runtime import error. Replace the current simplified import path
with the full and correct module path that matches the actual location of the
Greybody class in the codebase. The correct path should navigate through the
dust subpackage to reach the emission module where Greybody is actually defined.
| "Here is some code to download the grids using the `synthesizer-download` command.\n", | ||
| "\n", | ||
| "```python\n", | ||
| "\n", | ||
| "import subprocess\n", | ||
| "\n", | ||
| "subprocess.Popen([\"synthesizer-download\", \"--test-grids\", \"--dust-grid\"])" | ||
| "subprocess.Popen([\"synthesizer-download\", \"--test-grids\", \"--dust-grid\"])`\n", | ||
| "```" |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Remove stray backtick on line 49.
Line 49 has a trailing backtick that should be removed. The line should end with just a closing parenthesis.
Apply this diff:
-subprocess.Popen(["synthesizer-download", "--test-grids", "--dust-grid"])`
+subprocess.Popen(["synthesizer-download", "--test-grids", "--dust-grid"])🤖 Prompt for AI Agents
In docs/source/library_gen/synthesizer_crash_course.ipynb around lines 43 to 50,
remove the stray trailing backtick at the end of line 49 so the subprocess.Popen
call ends with a closing parenthesis only; update the line to drop the final
backtick character and ensure the surrounding triple-backtick fenced code block
remains correct.
| "metadata": {}, | ||
| "outputs": [], | ||
| "source": [ | ||
| "from synthesizer.emission_models import Blackbody\n", |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
🧩 Analysis chain
Verify the simplified import path for Blackbody.
The import has been updated to use the top-level package path, consistent with similar changes in other notebooks. Ensure that Blackbody is exported from synthesizer.emission_models.
🏁 Script executed:
#!/bin/bash
# Check if Blackbody is available at the top-level emission_models module
rg -n "class Blackbody" --type=py -C2
rg -n "from.*Blackbody" --type=py -C2Length of output: 1080
---
🏁 Script executed:
```shell
#!/bin/bash
# Check the synthesizer/emission_models module structure and re-exports
cat synthesizer/emission_models/__init__.py 2>/dev/null || echo "File not found"
# Also check if there's re-export in the dust submodule
cat synthesizer/emission_models/dust/__init__.py 2>/dev/null || echo "File not found"
# Find where Blackbody class is actually defined
rg -n "^class Blackbody" --type=py
Length of output: 29
---
🏁 Script executed:
```shell
#!/bin/bash
# Explore the synthesizer package structure
fd -t d "emission_models"
fd -t f "__init__.py" | grep -E "emission_models|synthesizer" | head -20
# Search for Blackbody class more flexibly
rg -n "class Blackbody" --type=py
# Check the actual file at the nested path
cat synthesizer/emission_models/dust/emission.py 2>/dev/null | head -100
Length of output: 0
---
🏁 Script executed:
```shell
#!/bin/bash
# Check if the notebook file exists and inspect the import context
find . -name "synthesizer_crash_course.ipynb" -type f
# Search for any Blackbody references in notebook files
rg -n "Blackbody" --type=json | head -20
# Check if there are any __init__ files at all
find . -name "__init__.py" -path "*/synthesizer/*" | head -10
Length of output: 205
---
🏁 Script executed:
```shell
#!/bin/bash
# Extract and inspect the notebook cell at line 319 context
python3 << 'EOF'
import json
with open('./docs/source/library_gen/synthesizer_crash_course.ipynb', 'r') as f:
notebook = json.load(f)
# Find cells containing imports around line 319
for i, cell in enumerate(notebook['cells']):
if cell['cell_type'] == 'code':
source = ''.join(cell['source'])
if 'Blackbody' in source or 'emission_models' in source:
print(f"Cell {i}:")
print(source)
print("\n---\n")
EOF
# Check if there's a setup.py or pyproject.toml that shows package structure
ls -la setup.py pyproject.toml 2>/dev/null || echo "No setup files found in root"
# Search for emission_models imports across all notebooks to find patterns
rg -n "from synthesizer" docs/source --type=json | grep -i "emission" | head -10
Length of output: 503
---
🏁 Script executed:
```shell
#!/bin/bash
# Check pyproject.toml for package configuration
cat pyproject.toml | head -50
# Search for all __init__.py files
find . -name "__init__.py" | head -20
# Look for emission_models module specifically
find . -type d -name "emission_models" | head -5
# Search for where TotalEmission is defined (to compare with Blackbody)
rg -n "class TotalEmission" --type=py -C2
Length of output: 1319
---
🌐 Web query:
```web
synthesizer python package Blackbody emission_models import
💡 Result:
Use the dust emission module. Example:
from synthesizer.emission_models.dust.emission import Blackbody
then create one, e.g.:
from unyt import K
bb = Blackbody(temperature=100 * K)
Sources: Synthesizer docs (dust emission examples). [1][2]
Correct the import path to match the official Synthesizer package structure.
The simplified import from synthesizer.emission_models import Blackbody is incorrect. The official Synthesizer documentation specifies: from synthesizer.emission_models.dust.emission import Blackbody, which aligns with the import patterns used in the repository's example scripts. Update the notebook cell 24 to use the correct nested import path.
🤖 Prompt for AI Agents
In docs/source/library_gen/synthesizer_crash_course.ipynb around line 319, the
import statement uses an incorrect path; replace the simplified import with the
official nested path: update the notebook cell (cell 24) to import Blackbody
from synthesizer.emission_models.dust.emission so it matches the package
structure and repository examples.
| from synthesizer import get_grids_dir | ||
|
|
||
| synthesizer_grid_dir = get_grids_dir() | ||
| os.environ["SYNTHESIZER_GRID_DIR"] = str(synthesizer_grid_dir) | ||
| return synthesizer_grid_dir |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
🧩 Analysis chain
Verify that get_grids_dir() returns the expected path for tests.
The fixture now uses dynamic path resolution via get_grids_dir() instead of a hardcoded path. Ensure that this function is available in the synthesizer package and returns a path that contains the test grids required by the test suite.
🏁 Script executed:
#!/bin/bash
# Verify get_grids_dir is available and check its usage in tests
rg -n "get_grids_dir" --type=py -C3Length of output: 1337
---
🏁 Script executed:
```shell
#!/bin/bash
# Find where get_grids_dir is defined in the synthesizer package
rg -n "def get_grids_dir" --type=py
Length of output: 0
🏁 Script executed:
#!/bin/bash
# Also check for any imports or definitions of synthesizer package
rg -n "synthesizer" tests/conftest.py -A5 -B5Length of output: 1824
🏁 Script executed:
#!/bin/bash
# Find requirements or setup files to identify synthesizer version
fd -type f \( -name "requirements*.txt" -o -name "setup.py" -o -name "pyproject.toml" -o -name "setup.cfg" \)Length of output: 170
🏁 Script executed:
#!/bin/bash
# Check what's in requirements files for synthesizer version
cat pyproject.toml | grep -A10 -B2 "synthesizer" || cat setup.py | grep -A5 "synthesizer" || grep "synthesizer" requirements*.txt 2>/dev/nullLength of output: 694
---
🌐 Web query:
```web
cosmos-synthesizer package get_grids_dir function
💡 Result:
I can't find a reference to a get_grids_dir function in the public cosmos-synthesizer / Synthesizer docs or code from quick searches. Do you want me to:
- search a specific GitHub repo/branch for that function, or
- explain how Synthesizer locates its grids (environment variables: SYNTHESIZER_DIR / SYNTHESIZER_DATA_DIR and related docs), or
- look at your local installation/code to find the function?
Tell me which and I'll search or inspect accordingly.
---
🌐 Web query:
```web
synthesizer-project/synthesizer github get_grids_dir
💡 Result:
I couldn't find a symbol named get_grids_dir in the repo, but Synthesizer stores/loads precomputed grids from the Synthesizer data directory (controlled by SYNTHESIZER_DIR / SYNTHESIZER_DATA_DIR). Look in your install's data directory (by default created when you first import or run synthesizer-init) — grids live under that data directory (e.g. <SYNTHESIZER_DATA_DIR>/grids). See the docs for grids and the environment-variable configuration. [1][2][3]
Sources:
- Grids documentation (where grids location/usage is described). [1]
- Installation / environment variables (SYNTHESIZER_DIR, SYNTHESIZER_DATA_DIR). [2]
- Repository README (getting grids / data server). [3]
Critical: get_grids_dir() does not exist in cosmos-synthesizer public API and will cause runtime ImportError.
Web search of the synthesizer-project repository confirms get_grids_dir() is not present in the public API. The cosmos-synthesizer package manages grids via environment variables (SYNTHESIZER_DIR/SYNTHESIZER_DATA_DIR) instead. The fixture will fail at runtime when attempting to import this non-existent function. Replace the import and function call with the documented environment variable approach or direct path resolution used by synthesizer.
🤖 Prompt for AI Agents
In tests/conftest.py around lines 43-47, the code imports and calls
get_grids_dir(), which is not part of the public cosmos-synthesizer API and will
raise ImportError; remove the import and call and instead set
SYNTHESIZER_GRID_DIR by resolving the grids path directly (for example: locate
the test fixtures/grids directory relative to this repo or use
importlib.resources / package data lookup to find the installed synthesizer
grids path), assign os.environ["SYNTHESIZER_GRID_DIR"] to that resolved path
(stringified) and return it; ensure you use the documented environment variable
approach (SYNTHESIZER_DIR / SYNTHESIZER_DATA_DIR or SYNTHESIZER_GRID_DIR per
project conventions) so the fixture no longer depends on the non-existent
get_grids_dir().
Summary by CodeRabbit
Documentation
Bug Fixes
Style