Feature/add spatial disaggregation recipe #57
base: main
@@ -0,0 +1,2 @@
```python
from .regridding import apply_weights
from .sd import SpatialDisaggregator
```
@@ -0,0 +1,8 @@
```python
import xesmf as xe


def apply_weights(regridder, input_data):
    regridder._grid_in = None
    regridder._grid_out = None
    result = regridder(input_data)
    return result
```
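`apply_weights` clears the regridder's stored source and target grid objects and then applies its precomputed weights to the input. Not part of the diff, but a minimal sketch of how it might be exercised with a bilinear xESMF regridder between two made-up global grids (the resolutions and the random field are purely illustrative, and xESMF/ESMPy must be installed):

```python
import numpy as np
import xarray as xr
import xesmf as xe

# hypothetical 2-degree (coarse) and 1-degree (fine) global grids
coarse = xr.Dataset({'lat': (['lat'], np.arange(-89.0, 90.0, 2.0)),
                     'lon': (['lon'], np.arange(0.0, 360.0, 2.0))})
fine = xr.Dataset({'lat': (['lat'], np.arange(-89.5, 90.0, 1.0)),
                   'lon': (['lon'], np.arange(0.0, 360.0, 1.0))})

regridder = xe.Regridder(coarse, fine, 'bilinear')

# a random coarse-resolution field standing in for model output
tas = xr.DataArray(np.random.rand(coarse.lat.size, coarse.lon.size),
                   coords=[coarse.lat, coarse.lon], dims=['lat', 'lon'])

# apply the precomputed weights to interpolate onto the fine grid
tas_fine = apply_weights(regridder, tas)
```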
@@ -0,0 +1,124 @@
```python
import numpy as np
import pandas as pd
import xarray as xr


class SpatialDisaggregator:
    """
    Spatial disaggregation model class.

    Apply the spatial disaggregation algorithm to an xarray Dataset with
    fit and predict methods, using the NASA-NEX method for spatial
    disaggregation (see Thrasher et al., 2012).

    Parameters
    ----------
    var : str
        Specifies the variable being downscaled. Default is
        'temperature'; the other option is 'precipitation'.
    """

    def __init__(self, var='temperature'):
        self._var = var

        if var == 'temperature':
            pass
        elif var == 'precipitation':
            pass
        else:
```
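The docstring points to the NASA-NEX spatial disaggregation approach (Thrasher et al., 2012). As a standalone, toy-data illustration of that approach, independent of the code under review: form anomalies against a coarse day-of-year climatology, interpolate them to the fine grid, then add back a fine-resolution climatology; precipitation would use multiplicative scale factors rather than additive anomalies. All names and values below are made up.

```python
import numpy as np
import pandas as pd
import xarray as xr

# toy daily temperature series on a single 2x2 grid (illustrative only)
time = pd.date_range('2000-01-01', periods=730, freq='D')
coarse = xr.DataArray(
    20 + np.random.randn(time.size, 2, 2),
    dims=['time', 'lat', 'lon'],
    coords={'time': time, 'lat': [0.0, 2.0], 'lon': [0.0, 2.0]},
    name='tas',
)

# 1) day-of-year climatology on the coarse grid
climo_coarse = coarse.groupby('time.dayofyear').mean('time')

# 2) additive anomalies relative to that climatology (temperature case)
anom = coarse.groupby('time.dayofyear') - climo_coarse

# 3) in the real workflow the anomalies would be interpolated to the fine
#    grid here, e.g. with an xESMF regridder; this toy example keeps one grid
anom_fine = anom

# 4) add a fine-resolution climatology back to recover absolute values
climo_fine = climo_coarse  # stand-in for an observed high-res climatology
downscaled = anom_fine.groupby('time.dayofyear') + climo_fine
```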
Suggested change:
```diff
-        if var == 'temperature':
-            pass
-        elif var == 'precipitation':
-            pass
-        else:
+        if var not in ['temperature', 'precipitation']:
```
maybe something like this:
```diff
-        if not np.array_equal(ds_bc[lat_name], climo_coarse[lat_name]):
-            raise ValueError('climo latitude dimension does not match model res')
-        if not np.array_equal(ds_bc[lon_name], climo_coarse[lon_name]):
-            raise ValueError('climo longitude dimension does not match model res')
+        if not ds_bc[lat_name].equals(climo_coarse[lat_name]):
+            raise ValueError('climo latitude coordinate does not match model res')
+        if not ds_bc[lon_name].equals(climo_coarse[lon_name]):
+            raise ValueError('climo longitude coordinate does not match model res')
```
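A small aside on the suggestion above (toy values, not from the PR): one practical difference is that xarray's `.equals()` compares dimensions, coordinates, and values and treats NaNs in matching positions as equal, whereas `np.array_equal` on the raw values returns False as soon as a NaN is present.

```python
import numpy as np
import xarray as xr

lat = xr.DataArray([10.0, np.nan, 30.0], dims='lat')
other = lat.copy()

print(np.array_equal(lat, other))  # False: NaN != NaN element-wise
print(lat.equals(other))           # True: same dims and values, NaN-aware
```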
Minor nit, but these are over-indented (same for all docstrings here).
I think this is fixed now.