More xarray, docs, and tests #29
Conversation
Codecov Report
@@            Coverage Diff             @@
##             main      #29      +/-   ##
==========================================
+ Coverage   93.12%   93.88%   +0.76%
==========================================
  Files           4        4
  Lines         160      180      +20
==========================================
+ Hits          149      169      +20
  Misses         11       11
malmans2 left a comment:
- No more loops over `z`
- Documentation in good shape
- A couple of additional tests
- We're now fully checking types (before, `xarray` objects were replaced by `Any`). All function arguments and returns have a type annotation, so from now on mypy should be more helpful than annoying. If you add type annotations as soon as you define new functions (see the sketch below), it should be easy to know whether we have to make changes upstream.
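As an aside, a minimal sketch of that convention with an illustrative helper name (not from the package): annotate arguments and returns with concrete xarray types rather than `Any` as soon as the function is defined.

```python
# Illustrative sketch only: a hypothetical helper annotated with xarray types
# instead of Any, so mypy can flag upstream signature changes early.
import xarray as xr


def scale_bathymetry(bathymetry: xr.DataArray, factor: float = 1.0) -> xr.DataArray:
    """Return the bathymetry scaled by ``factor`` (hypothetical example)."""
    return bathymetry * factor
```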
ldbletanh: bool, optional
    Logical flag to switch ON/OFF the double tanh stretching function.
    This flag is only needed for compatibility with NEMO DOMAINcfg tools.
    Just set ``ppa2``, ``ppkth2``, and ``ppacr2`` to switch ON
    the double tanh stretching function.
I think this flag is redundant. I suggest we keep it just for backward compatibility with NEMO tools.
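To illustrate the point, a minimal sketch (hypothetical helper, not the package's API) of how the double tanh switch could be inferred from the parameters alone:

```python
from typing import Optional


def _use_double_tanh(
    ppa2: Optional[float], ppkth2: Optional[float], ppacr2: Optional[float]
) -> bool:
    """Hypothetical sketch: switch ON the double tanh when all parameters are set."""
    return all(p is not None for p in (ppa2, ppkth2, ppacr2))
```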
ppa1_out = (aa / (bb - cc * (dd - ee))) if _is_nemo_none(ppa1) else ppa1
ppa0_out = (self._ppdzmin - ppa1_out * bb) if _is_nemo_none(ppa0) else ppa0
ppsur_out = (
    -(ppa0_out + ppa1_out * self._ppacr * ee) if _is_nemo_none(ppsur) else ppsur
)
Just helping out mypy...
# Errors
if ldbletanh is True and any(pp_are_none):
    raise ValueError(f"{prefix_msg} MUST be all float when ldbletanh is True")
if ldbletanh is None and (any(pp_are_none) and not all(pp_are_none)):
    raise ValueError(f"{prefix_msg} MUST be all None or float")

# Warning
if ldbletanh is False and not all(pp_are_none):
    warnings.warn(f"{prefix_msg} are ignored when ldbletanh is False")
Does this look right? Before, we were doing a stricter check (`pp > 0` rather than not None), but I think it makes more sense to default to None so the user can fully play with the underlying equation. If you think we should restore `pp > 0` when `ldbletanh` is True, then maybe we should also introduce more checks on the other parameters (e.g., `ppdzmin >= 0`, ...).
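If we did want to restore the stricter behaviour, a minimal sketch of that check (illustrative names and message, assuming the `pp*` values are available as a tuple):

```python
from typing import Optional, Tuple


def _check_double_tanh_params(
    ldbletanh: Optional[bool],
    pp_values: Tuple[Optional[float], ...],
    prefix_msg: str = "ppa2, ppkth2 and ppacr2",
) -> None:
    """Hypothetical stricter variant: require strictly positive values when ON."""
    if ldbletanh is True and not all(p is not None and p > 0 for p in pp_values):
        raise ValueError(
            f"{prefix_msg} MUST all be positive floats when ldbletanh is True"
        )
```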
# Initialise a dataset with empty z3 and e3 dataarrays
da = xr.full_like(ds["Bathymetry"], None).expand_dims(z=range(self._jpk))
for v, g in product(var, grd):
    ds[v + g] = da.copy()
    ds = ds.set_coords(v + g)
return ds
We don't need to initialize the dataset anymore. If you want to start `Sco` with a similar approach (i.e., start with loops, then remove them), you'd have to re-introduce this.
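For context, a standalone sketch of why the pre-initialisation became unnecessary: xarray auto-broadcasting combines a 2-D field with a 1-D vertical coordinate directly (toy dimensions and names here, not the package's):

```python
import numpy as np
import xarray as xr

# Toy 2-D bathymetry and 1-D vertical stretching coordinate
bathy = xr.DataArray(np.full((3, 4), 100.0), dims=("y", "x"))
sigma = xr.DataArray(np.linspace(0.0, -1.0, 5), dims="z")

# Auto-broadcasting builds the 3-D depth array in one expression,
# with no need to pre-allocate empty (z, y, x) variables and loop over z.
z3t = sigma * bathy
print(z3t.dims, z3t.shape)  # ('z', 'y', 'x') (5, 3, 4)
```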
def sea_mount(self, depth: float, stiff: float = 1) -> Dataset:
    """
    Channel with seamount case.

    Produces bathymetry of a channel with a Gaussian seamount in order to
    simulate an idealised test case. Based on Marsaleix et al., 2009
    doi:10.1016/j.ocemod.2009.06.011 Eq. 15.

    Parameters
    ----------
    depth: float
        Bottom depth (units: m).
    stiff: float
        Scale factor for steepness of seamount (units: None)

    Returns
    -------
    Dataset
    """
    ds = self._coords

    # Find half way point for sea mount location
    half_way = {k: v // 2 for k, v in ds.sizes.items()}
    glamt_mid, gphit_mid = (g.isel(half_way) for g in (ds.glamt, ds.gphit))

    # Define sea mount bathymetry
    ds["Bathymetry"] = depth * (
        1.0
        - 0.9
        * np.exp(
            -(
                stiff
                / 40.0e3 ** 2
                * ((ds.glamt - glamt_mid) ** 2 + (ds.gphit - gphit_mid) ** 2)
            )
        )
    )

    # Add rmax of Bathymetry
    # ds["rmax"] = DataArray(
    #     _calc_rmax(ds["Bathymetry"].to_masked_array()), dims=["y", "x"]
    # )
    ds["rmax"] = _calc_rmax(ds["Bathymetry"])

    return _add_attributes(_add_mask(ds))
Sorry, I did something pretty bad here.
This was actually introduced by @jdha.
I merged main and committed locally in a single commit, so GitHub doesn't realize that it's the same code (I just added the type annotation).
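For reference, a standalone sketch of the Marsaleix et al. (2009) Eq. 15 bathymetry on a toy Cartesian grid (the coordinate names and grid extents here are placeholders, not the package's `glamt`/`gphit`):

```python
import numpy as np
import xarray as xr

depth, stiff = 5000.0, 1.0  # bottom depth (m) and steepness factor

# Toy 1-D horizontal coordinates in metres
x = xr.DataArray(np.linspace(0.0, 200.0e3, 101), dims="x")
y = xr.DataArray(np.linspace(0.0, 100.0e3, 51), dims="y")
x_mid = x.isel(x=x.sizes["x"] // 2)
y_mid = y.isel(y=y.sizes["y"] // 2)

# Gaussian seamount: ~10% of the bottom depth at the summit, full depth far away
bathy = depth * (
    1.0
    - 0.9 * np.exp(-(stiff / 40.0e3 ** 2 * ((x - x_mid) ** 2 + (y - y_mid) ** 2)))
)
print(float(bathy.min()), float(bathy.max()))  # ~500 m at the summit
```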
    both_rmax.append(np.abs(rmax))

return np.maximum(*both_rmax)
Same, this is from James' latest PR.
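For anyone reading along, a minimal standalone sketch of the standard slope ("rmax") factor, |h1 - h0| / (h1 + h0) between adjacent cells, that code like the snippet above typically builds up before combining the two horizontal directions with np.maximum. The in-package `_calc_rmax` may differ in details (masking, staggered grid), so treat this as an illustration only:

```python
import numpy as np


def rmax_sketch(depth: np.ndarray) -> np.ndarray:
    """Hypothetical 2-D slope parameter: max over both horizontal directions."""
    both_rmax = []
    for axis in (0, 1):
        h0 = np.take(depth, np.arange(depth.shape[axis] - 1), axis=axis)
        h1 = np.take(depth, np.arange(1, depth.shape[axis]), axis=axis)
        rmax = (h1 - h0) / (h1 + h0)
        # pad back to the original shape so the two directions can be compared
        pad = [(0, 0), (0, 0)]
        pad[axis] = (0, 1)
        both_rmax.append(np.pad(np.abs(rmax), pad))
    return np.maximum(*both_rmax)
```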
Sorry about the mess... Forget about this PR. #31 is the right one!
Related: xarray auto-broadcasting #19 · Zco and Zgr classes to public API #25 · pre-commit run --all-files · api.rst