Conversation
@Radonirinaunimi this should correspond to the patch you had at some point. Could you please have a look?
Thanks. Is this what you are using now for the plots you are creating? (If not, could you add that as well, so we have it in the repo.)
Sure, I'll add some vp functions here to produce those plots in a hyperopt report.
Quoted code:

```python
    return fig

@figure
def plot_cumulative_loss(hyperopt_dataframe):
    return fig

@figure
def plot_cumulative_logp_chi2(hyperopt_dataframe):
    chi2_ = results['hyper_loss_chi2'].to_numpy()
    chi2exp = results['trvl_loss_chi2exp'].to_numpy()

    idx_ok = np.where(chi2exp < 1.35)
```

Put this threshold as an argument (defaulting to 1.35), just so that it can be changed easily (and so it is obvious that it is a threshold).
Quoted code:

```python
    chi2_ = results['hyper_loss_chi2'].to_numpy()

    # don't look at samples with -logp or chi2 too big
    idx_ok = np.where(chi2_ < 5.)
```

Put this threshold as an argument (defaulting to 5), just so that it can be changed easily (and so it is obvious that it is a threshold).
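A minimal sketch of the suggestion: expose both cutoffs as keyword arguments, with the current hard-coded values (1.35 and 5) as defaults. The function and parameter names here are illustrative, not from the PR:

```python
import numpy as np

def select_good_trials(chi2, chi2exp, chi2_threshold=5.0, chi2exp_threshold=1.35):
    """Return indices of trials passing both quality cuts.

    The defaults reproduce the previously hard-coded values, but keyword
    arguments make the thresholds explicit and easy to change from the
    calling code.
    """
    return np.where((chi2 < chi2_threshold) & (chi2exp < chi2exp_threshold))[0]

# Usage sketch with toy data: trials 1 and 2 fail one cut each
chi2 = np.array([2.0, 6.0, 3.0, 4.0])
chi2exp = np.array([1.2, 1.2, 1.5, 1.0])
idx_ok = select_good_trials(chi2, chi2exp)  # array([0, 3])
```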
Quoted code:

```python
    filters set in the commandline arguments.
    """
    drop_keys = ["hyper_losses_chi2", "hyper_losses_phi2", "hyper_losses_logp"]
    drop_keys += [f"layer_{idx}" for idx in range(1, 5)]
```

Perhaps the range should be larger, just in case there are more layers. Perhaps even read how many layers we have.
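The layer count could also be read off the dataframe itself rather than hard-coded. A sketch, assuming the per-layer columns are named `layer_<idx>` (the helper name is hypothetical):

```python
import pandas as pd

def layer_drop_keys(dataframe):
    """Collect every 'layer_<idx>' column actually present in the
    dataframe, instead of assuming a fixed range(1, 5)."""
    return [col for col in dataframe.columns if col.startswith("layer_")]

# Usage sketch: works regardless of how many layers the scan used
df = pd.DataFrame(columns=[f"layer_{i}" for i in range(1, 7)] + ["loss"])
drop_keys = ["hyper_losses_chi2", "hyper_losses_phi2", "hyper_losses_logp"]
drop_keys += layer_drop_keys(df)
```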
Co-authored-by: Juan M. Cruz-Martinez <juacrumar@lairen.eu>
@scarlehoff if you are happy with this we can merge it.
Updating hyperoptplot for new hyperopt keys.