
Commit e644586

Merge pull request #31 from synthesizer-project/docs_update
Docs update
2 parents 0f19223 + 2c6ed6e commit e644586


4 files changed: +13 -13 lines changed


docs/source/FAQ/FAQ.rst

Lines changed: 3 additions & 3 deletions
@@ -22,9 +22,9 @@ How do I deal with low sampling acceptance?
 -------------------------------------------
 
 A: You will be warned if your sampling acceptance is low during inference. Low sampling acceptance means that the model is predicting posterior samples which are outside the prior proposal range. To deal with this, you can try the following:
-- Increase the prior proposal range to ensure that the true parameters are within the prior support. For some parameters this can be reasonable, for example setting a slightly wider range for stellar mass.
-- Train a better model - low sampling acceptance can indicate that the model is not accurately capturing the posterior distribution. You can try training a more complex model, or using more training data to improve the model's performance.
-- You can check if a specific parameter is causing low sampling acceptance by looking at the acceptance per parameter during inference using the custom torch prior implemented in Synference. You can enable this by running `SBI_Fitter.create_priors(debug_sample_acceptance=True)` before inference. This will log the acceptance rate for each parameter, allowing you to identify any parameters that may be causing issues.
+* Increase the prior proposal range to ensure that the true parameters are within the prior support. For some parameters this can be reasonable, for example setting a slightly wider range for stellar mass.
+* Train a better model - low sampling acceptance can indicate that the model is not accurately capturing the posterior distribution. You can try training a more complex model, or using more training data to improve the model's performance.
+* You can check if a specific parameter is causing low sampling acceptance by looking at the acceptance per parameter during inference using the custom torch prior implemented in Synference. You can enable this by running `SBI_Fitter.create_priors(debug_sample_acceptance=True)` before inference. This will log the acceptance rate for each parameter, allowing you to identify any parameters that may be causing issues.
 
 How do I load a previously trained model for inference?
 -------------------------------------------------------
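
The last added bullet in the hunk above points to the `debug_sample_acceptance` option. A minimal, hedged sketch follows: only the `create_priors(debug_sample_acceptance=True)` call is taken from the FAQ text, while the trained `fitter` object and how it is built or loaded are assumptions and not shown.

```python
# Sketch only: assumes `fitter` is an already-trained Synference SBI_Fitter
# instance; its construction/loading is not part of the FAQ text and is omitted.
fitter.create_priors(debug_sample_acceptance=True)

# Running inference afterwards will log the acceptance rate per parameter,
# which helps identify parameters whose posterior samples frequently fall
# outside the prior proposal range.
```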

docs/source/library_gen/basic_library_generation.ipynb

Lines changed: 5 additions & 5 deletions
@@ -42,7 +42,7 @@
 "id": "11b7cf83",
 "metadata": {},
 "source": [
-"These arrays can be generated using the `draw_from_hypercube` function, which takes as input the number of samples you want to draw, and a dictionary of parameter ranges. In this dictionary, the key is the parameter range, and the value is a tuple defining the minimum and maximum value for that parameter. The function will then return a dictionary of arrays, where each array contains the sampled values for each parameter. \n",
+"These arrays can be generated using the `draw_from_hypercube` function, which takes as input the number of samples you want to draw, and a dictionary of parameter ranges. In this dictionary, the key is the parameter name, and the value is a tuple defining the minimum and maximum value for that parameter. The function will then return a dictionary of arrays, where each array contains the sampled values for each parameter. Other sampling schemes can be used (e.g. a [Sobol sequence](https://docs.scipy.org/doc/scipy/reference/generated/scipy.stats.qmc.Sobol.html)).\n",
 "\n",
 "By default, the `draw_from_hypercube` function uses a form of low-discrepancy sampling called **Latin Hypercube Sampling (LHS)**. LHS is highly efficient compared to random sampling, especially when dealing with many parameters, as it ensures that the parameter space is sampled more evenly.\n",
 "\n",
@@ -311,7 +311,7 @@
 "id": "9ef59dba",
 "metadata": {},
 "source": [
-"You'll notice that above, unlike in the first tutorial, we didn't set the optical depth `tau_v` directy on the emission model. This is because we want it to vary between galaxies, so we will instead pass this attrivute to `GalaxyBasis` to set it on the individual `Galaxy` level."
+"You'll notice that above, unlike in the first tutorial, we didn't set the optical depth `tau_v` directly on the emission model. This is because we want it to vary between galaxies, so we will instead pass this attribute to `GalaxyBasis` to set it at the individual `Galaxy` level."
 ]
 },
 {
@@ -396,7 +396,7 @@
 "id": "2aba55cb",
 "metadata": {},
 "source": [
-"Before we make our mock library, we also choose to save the `max_age` parameter. This is a derived parameter based on the redshift of the galaxy and is useful for later analysis so that we can easily reconstruct the original parameter space."
+"We also save the `max_age` parameter, which is a derived parameter based on the redshift of the galaxy. This is useful for later analysis so we can easily reconstruct the original parameter space."
 ]
 },
 {
@@ -493,7 +493,7 @@
 "- Supplementary parameters array: An empty supplementary parameters array in this case, designed to store optional derived quantities such as star formation rates, or the surviving stellar mass\n",
 "- Model Group: This stores information about the emission model and instrument used, allowing us to recreate the emission model and instrument later if we need to\n",
 "\n",
-"It's worth noting that the only required arrays are the 'parameters' and 'photometry' datasets. So you can entirely avoid using Synthesizer and build models externally using your code and method of choice, as long as you can produde a HDF5 array with the same simple format you will be able to use the SBI functioality of synference with your code. Please see the tutorial where we train a model from the outputs of the hydrodynamical simulation SPHINX for an example."
+"It's worth noting that the only required arrays are the 'parameters' and 'photometry' datasets. This means you can avoid Synthesizer entirely and build models externally using your code and method of choice: as long as you can produce an HDF5 file with the same simple format, you will be able to use the SBI functionality of Synference with your code. Please see [this example](../library_gen/bring_your_own_library.ipynb), where we train a model from the outputs of the hydrodynamical simulation SPHINX."
 ]
 },
 {
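
As a minimal sketch of the externally-built library idea in the added line above: the dataset names 'parameters' and 'photometry' are taken from the text, but the overall file layout (groups, attributes, any extra datasets) is an assumption here; the linked bring-your-own-library notebook documents the authoritative format.

```python
# Sketch only: write the two datasets named in the docs ('parameters' and
# 'photometry') to an HDF5 file with h5py. Array shapes and the file layout
# are illustrative assumptions, not the documented Synference format.
import h5py
import numpy as np

n_gal, n_param, n_band = 10_000, 5, 8                 # illustrative sizes
params = np.random.uniform(size=(n_gal, n_param))     # stand-in model parameters
phot = np.random.uniform(size=(n_gal, n_band))        # stand-in photometry

with h5py.File("my_external_library.hdf5", "w") as f:
    f.create_dataset("parameters", data=params)
    f.create_dataset("photometry", data=phot)
```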
@@ -503,7 +503,7 @@
 "source": [
 "## Plotting a galaxy from our model\n",
 "\n",
-"Lastly, Synference has some debug methods to plot specific or random individual galaxy SEDs, photometry and star formation histories - `plot_galaxy` and `plot_random_galaxy`. Below we plot a random galaxy from the model. "
+"Synference has some debug methods to plot specific or random individual galaxy SEDs, photometry and star formation histories - `plot_galaxy` and `plot_random_galaxy`. Below we plot a random galaxy from the model."
 ]
 },
 {
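
A brief, assumption-heavy usage sketch of the two debug methods named above: only the method names `plot_galaxy` and `plot_random_galaxy` come from the documentation text; the object they are called on and any arguments are hypothetical.

```python
# Hypothetical fragment: `basis` stands in for the library/GalaxyBasis object
# built earlier in the notebook; only the method names come from the docs.
basis.plot_random_galaxy()   # plot a randomly chosen galaxy from the library
basis.plot_galaxy(0)         # hypothetical call plotting a specific galaxy by index
```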

docs/source/library_gen/complex_library_generation.ipynb

Lines changed: 3 additions & 3 deletions
@@ -71,7 +71,7 @@
 "id": "70cb723f",
 "metadata": {},
 "source": [
-"Firstly we'll set up our Synthesizer similarly to before - see the basic library generation example for more details.\n",
+"We'll set up Synthesizer similarly to before - see the basic library generation example for more details.\n",
 "\n",
 "For this example we'll use a set of filters used in wide area surveys, including VISTA, Subaru Hyper Suprime-Cam, Euclid, and Spitzer IRAC.\n",
 "\n",
@@ -408,7 +408,7 @@
 "id": "27812cf5",
 "metadata": {},
 "source": [
-"Now we can instantiate the GalaxyBasis, into which we will pass these inputs. This won't do much until we call the correct function to build the grid. Note that we set 'build_grid' = False, because we have already generated our full grid of parameters. If we wanted we could also pass in a smaller set of parameter values instead, and set build_grid=True, and the code would generate all the combinations of those parameters."
+"Now we can instantiate the GalaxyBasis, into which we will pass these inputs. This won't do much until we call the correct function to build the library. Note that we set `build_library=False`, because we have already generated our full grid of parameters. If we wanted, we could instead pass in a smaller set of parameter values and set `build_library=True`, and the code would generate all the combinations of those parameters."
 ]
 },
 {
@@ -430,7 +430,7 @@
 " galaxy_params=galaxy_params,\n",
 " alt_parametrizations=alt_parametrizations,\n",
 " redshift_dependent_sfh=True,\n",
-" build_grid=False,\n",
+" build_library=False,\n",
 " log_stellar_masses=all_param_dict[\"log_masses\"],\n",
 ")"
 ]

docs/source/library_gen/synthesizer_crash_course.ipynb

Lines changed: 2 additions & 2 deletions
@@ -152,7 +152,7 @@
 "id": "a034c750",
 "metadata": {},
 "source": [
-"### 1. Define Star Formation History\n",
+"#### 1. Define Star Formation History\n",
 "\n",
 "Synthesizer includes several built-in star formation history (SFH) models, including commonly used models such as: \n",
 "- Constant\n",
@@ -508,7 +508,7 @@
 "\n",
 "This modularity provides the foundation for powerful high-level tools. For instance, the SBI-Fitters library generation tools build directly on this structure to efficiently create the large libraries of synthetic observables needed for simulation-based inference (SBI).\n",
 "\n",
-"To dive deeper into this application, refer to the next section on library generation."
+"The Synference library generation tools build on this framework to create large libraries of synthetic observables for use in simulation-based inference. You can learn more about library generation in the next section of the documentation."
 ]
 }
 ],
