[Diff] sbAsma/issue1279 noise conditioning #1358
New config file (`@@ -0,0 +1,52 @@`):

```yaml
# Changes from original to new config
forecast_offset: 1        # changed from 0
forecast_steps: 1         # changed from 0
forecast_policy: "fixed"  # changed from null
fe_num_blocks: 8          # changed from 0
fe_diffusion_model: True  # new parameter added

training_mode: "forecast"
"target_and_aux_calc": "DiffusionLatentTargetEncoder"
training_mode_config: {
  "losses": {
    LossLatentDiffusion: {
      weight: 1.0, loss_fcts: [['mse', 1.0]]
    }
  },
  "shared_heads": False,
  "teacher_model": {}
}
validation_mode_config: {
  "losses": {
    LossLatentDiffusion: {
      weight: 1.0, loss_fcts: [['mse', 1.0]]
    }
  },
  "shared_heads": False,
  "teacher_model": {}
}

# training_mode: "forecast"  # changed from "masking"
# training_mode_config:
#   losses:
#     LossLatentDiffusion:  # changed from LossPhysical
#       weight: 1.0
#       loss_fcts: [['mse', 1.0]]
#   shared_heads: False  # new parameter
#   target_and_aux_calc: "DiffusionLatentTargetEncoder"  # new parameter
#   teacher_model: {}  # new parameter

# validation_mode_config:
#   losses:
#     LossLatentDiffusion:  # changed from LossPhysical
#       weight: 1.0
#       loss_fcts: [['mse', 1.0]]
#   shared_heads: False  # new parameter
#   target_and_aux_calc: "DiffusionLatentTargetEncoder"  # new parameter
#   teacher_model: {}  # new parameter

samples_per_mini_epoch: 32  # changed from 4096
samples_per_validation: 8   # changed from 512

# Updated comment for training_mode
# training mode: "forecast" or "masking" (masked token modeling) or "student-teacher"
```
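Because `fe_diffusion_model` is a newly added flag, existing configs will not define it. A minimal sketch of the `getattr`-with-default pattern the code change below uses to stay backwards compatible; `types.SimpleNamespace` stands in for the real config object, which is an assumption:

```python
from types import SimpleNamespace

# Hypothetical stand-ins for the config object; attribute names come from the diff.
old_cfg = SimpleNamespace(fe_num_blocks=0)                       # config predating the flag
new_cfg = SimpleNamespace(fe_num_blocks=8, fe_diffusion_model=True)

def diffusion_enabled(cf) -> bool:
    # getattr with a default keeps old configs working when the key is absent
    return bool(getattr(cf, "fe_diffusion_model", False))

print(diffusion_enabled(old_cfg))  # False: flag absent, default applies
print(diffusion_enabled(new_cfg))  # True
```

Accessing `cf.fe_diffusion_model` directly would instead raise `AttributeError` on any config written before this PR.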
Changes to `create()` (`@@ -332,9 +332,15 @@ def create(self) -> "Model":`):

```diff
             "Empty forecast engine (fe_num_blocks = 0), but forecast_steps[i] > 0 for some i"
         )

-        self.forecast_engine = ForecastingEngine(cf, self.num_healpix_cells)
-        if cf.fe_diffusion_model:
-            self.forecast_engine = DiffusionForecastEngine(forecast_engine=self.forecast_engine)
+        # check if diffusion mode is enabled
+        fe_diffusion_model = getattr(cf, "fe_diffusion_model", False)
+        if fe_diffusion_model:
+            self.forecast_engine = DiffusionForecastEngine(
+                forecast_engine=ForecastingEngine(cf, self.num_healpix_cells)
+            )
+        else:
+            self.forecast_engine = ForecastingEngine(cf, self.num_healpix_cells)

         ###############
         # embed coordinates yielding one query token for each target token
```
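The change composes the base engine inside a diffusion wrapper rather than subclassing it. A self-contained sketch of that composition pattern; only the class names and constructor shapes come from the diff, while the `forward` method and its behavior are assumptions:

```python
class ForecastingEngine:
    """Stand-in for the deterministic base engine (constructor shape from the diff)."""

    def __init__(self, cf, num_healpix_cells):
        self.cf = cf
        self.num_healpix_cells = num_healpix_cells

    def forward(self, tokens):
        return tokens  # placeholder for the actual forecast computation


class DiffusionForecastEngine:
    """Wraps a base engine instead of subclassing it (keyword arg from the diff)."""

    def __init__(self, forecast_engine):
        self.forecast_engine = forecast_engine  # the wrapped base engine

    def forward(self, tokens):
        # Hypothetical: a real implementation would add noise conditioning here,
        # then delegate the forecast step to the wrapped engine.
        return self.forecast_engine.forward(tokens)


# Mirrors the else-free construction in the new code path:
engine = DiffusionForecastEngine(
    forecast_engine=ForecastingEngine(cf=None, num_healpix_cells=12)
)
print(type(engine.forecast_engine).__name__)  # ForecastingEngine
```

Composition keeps the diffusion logic optional: when the flag is off, callers get a plain `ForecastingEngine` with an identical interface.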
Review comments:

> It's clean to have the diffusion config in a separate file. Not sure if this is needed at the moment, though. Maybe @MatKbauer has some thoughts?

> Yes, was thinking so too. Let's leave the config messy for now during the diffusion forecast engine testing and revert to the original defaults later (before we actually merge it into develop). @sbAsma, can you hold back the changes to `default_config.yml` for now?

> Concerning the separate file for the diffusion engine: it would help keep `default_config.yml` from growing incredibly large. On the other hand, we were concerned that with a nested config structure we would have a hard time finding certain parameters (as we would not be able to search for them with Ctrl+F).