Conversation

@brian-eaton
Collaborator

Fixes #220 Add MPAS-A dycore

brian-eaton and others added 30 commits October 21, 2019 16:15
Towards the goal of letting the dynamics import and export states point
directly to memory that is "owned" by the MPAS-Atmosphere dycore, this
commit simply allocates the dynamics import and export states in dyn_init
and sets the export state to a resting standard atmosphere in the d_p_coupling
routine.

Changes to the d_p_coupling routine were necessary to support non-local
d-p maps, and with these changes, a CAM-MPAS simulation now takes time steps
without crashing. Each time step simply provides the physics with the same
simple dynamics export state.
…ools

Previously, the memory for fields in the dynamics import and export state was
locally allocated by CAM. Now, the members of the import and export state point
to arrays that are allocated and managed in pools by MPAS-Atmosphere.
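As a rough illustration of the new arrangement (a minimal sketch with assumed names; the pool name, field names, and type members below are illustrative, not the actual CAM-MPAS definitions, and double precision is assumed for MPAS's RKIND):

module dyn_export_sketch
   use mpas_derived_types, only : mpas_pool_type
   use mpas_pool_routines, only : mpas_pool_get_array
   implicit none

   type :: dyn_export_t
      real(8), pointer :: theta_m(:,:) => null()   ! moist potential temperature (nVertLevels, nCells)
      real(8), pointer :: rho_zz(:,:)  => null()   ! dry density coupled with d(zeta)/dz
   end type dyn_export_t

contains

   subroutine set_export_pointers(dyn_out_pool, dyn_out)
      type(mpas_pool_type), pointer     :: dyn_out_pool   ! 'dyn_out' pool defined in the MPAS Registry
      type(dyn_export_t), intent(inout) :: dyn_out

      ! the export-state members simply alias MPAS-owned memory; CAM allocates nothing
      call mpas_pool_get_array(dyn_out_pool, 'theta_m', dyn_out%theta_m)
      call mpas_pool_get_array(dyn_out_pool, 'rho_zz',  dyn_out%rho_zz)
   end subroutine set_export_pointers

end module dyn_export_sketch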

The changes to CAM-MPAS are supported by a new MPAS tag, v6.1-cam.013, that
defines new pools (dyn_in and dyn_out) in the MPAS-Atmosphere Registry file.
This commit updates the CIME tag to mpas.007-cime5.8.12 and adds the MPAS-A
60, 30, 15, and 15-3 km meshes.

Additionally, all MPAS externals have been updated to the v6.1-cam.013 tag.
Previously, only the 'mpas_core' external had been updated to this tag, while
other MPAS externals were still pulling from the v6.1-cam.012 tag.
…ostic vars

The CAM-MPAS dynamics import and export states were previously defined in terms
of variables that were not directly prognosed by the MPAS-Atmosphere dycore.

This commit updates the MPAS-Atmosphere externals to tag v6.1-cam.014, which
removes the old pools for import/export state in the MPAS Registry and
which defines new fields to hold the total tendencies from physics for dycore-
prognosed variables.

To match the new definitions of the 'dyn_import_t' and 'dyn_export_t' types,
the d_p_coupling routine has been modified to use the new members of
dyn_export_t, but only in the case that local_dp_map is .false..
The p_d_coupling routine has not yet been updated to the new 'dyn_import_t'
members.

The code that had been added in d_p_coupling to set a fake dycore state has
been moved to the read_inidat routine.

Lastly, there are still aspects of the use of the new dycore import and export
states that need to be corrected. At present, the vertical coordinate metric
d(zeta)/dz (in the array zz) is not valid, so the 'rho_zz' field cannot be
computed correctly in cases where there is terrain in the model simulation,
and the interpolation weight fields 'fzm' and 'fzp', used to interpolate from
layer midpoints to layer interfaces, do not contain valid values.
The setup of all time-invariant fields -- including the vertical coordinate
surfaces, vertical coordinate metrics, and vertical interpolation weights --
now happens in the dyn_grid_init routine.

Because the vertical coordinate surfaces in MPAS-A are a function of
the terrain, and the vertical coordinate surfaces are read from the initial
file, there is no longer a need to read a separate terrain field, and so
the read_phis routine has been deleted.

Also, the reading of horizontal grid coordinates has been combined with
the reading of other time-invariant fields, and so the cam_mpas_read_geometry
routine has also been deleted.
The cam_mpas_read_static routine reads all time-invariant fields needed by
the MPAS-A dycore, but it previously did not update the halos for horizontally
decomposed fields. Now, the halos are updated for all time-invariant fields
in the cam_mpas_read_static routine.
The zz field is needed in the computation of the MPAS-A dycore prognostic
state, and accordingly, this field has been added to the dynamics import state.
The MPAS-A dycore import and export states now contain additional time-invariant
fields -- zz, fzm, fzp, and index_qv. These additional fields are used in
the dynamics-physics coupling to derive physics state from fields that are part
of the dycore prognostic state, for example, computing rho directly from rho_zz
and zz, and to correctly interpolate from layer midpoints to layer interfaces.
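For concreteness, the kind of derivation this involves looks roughly like the following sketch (array shapes, loop bounds, and the generic field phi are assumptions; in MPAS-A the full dry density is rho_zz*zz, and fzm/fzp weight the two midpoint values bracketing an interface):

subroutine derive_physics_fields_sketch(nVertLevels, nCells, zz, fzm, fzp, rho_zz, phi_mid, rho, phi_int)
   integer, intent(in)  :: nVertLevels, nCells
   real(8), intent(in)  :: zz(nVertLevels,nCells)        ! d(zeta)/dz
   real(8), intent(in)  :: fzm(nVertLevels), fzp(nVertLevels)   ! midpoint-to-interface weights
   real(8), intent(in)  :: rho_zz(nVertLevels,nCells)    ! prognostic coupled dry density
   real(8), intent(in)  :: phi_mid(nVertLevels,nCells)   ! an arbitrary layer-midpoint field
   real(8), intent(out) :: rho(nVertLevels,nCells)       ! full dry density
   real(8), intent(out) :: phi_int(nVertLevels,nCells)   ! values at interior interfaces 2..nVertLevels

   integer :: iCell, k

   do iCell = 1, nCells
      ! recover rho directly from the prognosed rho_zz and the metric zz
      do k = 1, nVertLevels
         rho(k,iCell) = rho_zz(k,iCell) * zz(k,iCell)
      end do
      ! interpolate a midpoint field to the interior layer interfaces using fzm/fzp
      do k = 2, nVertLevels
         phi_int(k,iCell) = fzm(k)*phi_mid(k,iCell) + fzp(k)*phi_mid(k-1,iCell)
      end do
   end do
end subroutine derive_physics_fields_sketch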
This merge adds more time-invariant fields to the MPAS-A dycore state,
and it also ensures that all time-invariant fields have filled-out halo
elements before the call to read_inidat. Also, changes have been made
in the dp_coupling routines to preferentially use fields that are part
of the dycore prognostic state rather than those that are diagnosed by
the dycore (e.g., preferring rho_zz rather than rho).
Update the MPAS externals to a new tag, v6.1-cam.015, which will permit changes
in subsequent commits to make use of a routine in the mpas_stream_manager
module that was previously private.
The mesh indexing fields read by the cam_mpas_read_static routine reference
global indices in the MPAS mesh file, but when fields are partitioned across
MPI ranks, what is needed are local indices. The cam_mpas_read_static routine
now invokes code from the MPAS infrastructure to convert global indices to
local indices in the cellsOnCell, edgesOnCell, cellsOnEdge, etc. arrays.
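Conceptually, the conversion is a lookup of each global ID in the list of global IDs stored on this rank; a minimal sketch of the idea (illustrative only; the actual MPAS infrastructure code is more efficient and also handles neighbor cells that are not present locally):

subroutine global_to_local_sketch(nCellsLocal, maxEdges, indexToCellID, cellsOnCell)
   integer, intent(in)    :: nCellsLocal, maxEdges
   integer, intent(in)    :: indexToCellID(nCellsLocal)         ! global ID of each locally stored cell
   integer, intent(inout) :: cellsOnCell(maxEdges,nCellsLocal)  ! global IDs on input, local indices on output

   integer :: iCell, j, k

   do iCell = 1, nCellsLocal
      do j = 1, maxEdges
         do k = 1, nCellsLocal
            if (cellsOnCell(j,iCell) == indexToCellID(k)) then
               cellsOnCell(j,iCell) = k
               exit
            end if
         end do
      end do
   end do
end subroutine global_to_local_sketch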
In preparation for future code to project cell-centered wind vectors to
cell edges, compute the local north and east unit vectors at cell centers
and the unit normal vector at cell edges.
For any future development that requires the halo elements of MPAS dycore fields
to be updated, this commit adds new code callable from CAM dycore interface
routines to update the halo for specified MPAS fields.
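A rough sketch of what such a helper could look like, assuming the MPAS v6.x pool and halo-exchange interfaces (the pool, field name, and time level here are illustrative):

subroutine exch_halo_sketch(statePool)
   use mpas_derived_types, only : mpas_pool_type, field2DReal
   use mpas_pool_routines, only : mpas_pool_get_field
   use mpas_dmpar,         only : mpas_dmpar_exch_halo_field

   type(mpas_pool_type), pointer :: statePool
   type(field2DReal), pointer :: theta_m_field

   ! look up the named field in its pool, then hand it to the MPAS halo-exchange machinery
   call mpas_pool_get_field(statePool, 'theta_m', theta_m_field, 1)
   call mpas_dmpar_exch_halo_field(theta_m_field)
end subroutine exch_halo_sketch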
This commit adds a new routine to the cam_mpas_subdriver module to be used in
setting up initial conditions for horizontal velocity and in transforming
horizontal velocity tendencies from physics to dynamics. The routine projects
cell-centered wind vectors onto the unit normal vectors at edges. In support of
the use of this routine, several new variables have been added to the dynamics
import and export states.
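In spirit, the projection is a dot product of an edge-averaged wind vector with the edge's unit normal; a minimal sketch for a single vertical level, with assumed (illustrative) array names and shapes:

subroutine project_winds_to_edges_sketch(nEdges, cellsOnEdge, windCell, edgeNormal, uNormal)
   integer, intent(in)  :: nEdges
   integer, intent(in)  :: cellsOnEdge(2,nEdges)   ! local indices of the two cells sharing each edge
   real(8), intent(in)  :: windCell(3,*)           ! cell-centered wind in a local Cartesian basis
   real(8), intent(in)  :: edgeNormal(3,nEdges)    ! unit normal vector at each edge
   real(8), intent(out) :: uNormal(nEdges)         ! normal wind component at each edge

   integer :: iEdge, c1, c2
   real(8) :: wEdge(3)

   do iEdge = 1, nEdges
      c1 = cellsOnEdge(1,iEdge)
      c2 = cellsOnEdge(2,iEdge)
      wEdge(:) = 0.5_8*(windCell(:,c1) + windCell(:,c2))       ! simple two-cell average at the edge
      uNormal(iEdge) = dot_product(wEdge, edgeNormal(:,iEdge)) ! project onto the edge normal
   end do
end subroutine project_winds_to_edges_sketch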
…tate

This commit modifies which pressure fields (if any) are stored in the dycore
import and export states, and it adds a new routine for computing dry
hydrostatic pressure from the MPAS export state. Now, there are no longer any
pressure fields in the dynamics import state, and the dynamics export state contains
pmiddry and pintdry, which are computed at the beginning of d_p_coupling by
a new routine, dry_hydrostatic_pressure.
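For orientation, dry hydrostatic pressure can be built up by integrating rho_dry*g through the layer thicknesses from the model top downward. The following is only a sketch of that idea (argument names, the fixed top pressure, and the surface-to-top index convention are assumptions, not the actual dry_hydrostatic_pressure routine, which works from MPAS export-state fields such as rho_zz and zz):

subroutine dry_hydrostatic_pressure_sketch(nver, ncol, ptop, rhod, dz, pintdry, pmiddry)
   integer, intent(in)  :: nver, ncol
   real(8), intent(in)  :: ptop                  ! dry pressure at the model top [Pa]
   real(8), intent(in)  :: rhod(nver,ncol)       ! dry air density [kg/m3], k=1 at the surface
   real(8), intent(in)  :: dz(nver,ncol)         ! layer thickness [m]
   real(8), intent(out) :: pintdry(nver+1,ncol)  ! dry pressure at layer interfaces
   real(8), intent(out) :: pmiddry(nver,ncol)    ! dry pressure at layer midpoints

   real(8), parameter :: gravity = 9.80616_8
   integer :: i, k

   do i = 1, ncol
      pintdry(nver+1,i) = ptop
      ! hydrostatic balance integrated downward: dp = rho_dry * g * dz
      do k = nver, 1, -1
         pintdry(k,i) = pintdry(k+1,i) + rhod(k,i)*gravity*dz(k,i)
      end do
      ! midpoint value taken as the mean of the bounding interfaces
      do k = 1, nver
         pmiddry(k,i) = 0.5_8*(pintdry(k,i) + pintdry(k+1,i))
      end do
   end do
end subroutine dry_hydrostatic_pressure_sketch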
@cacraigucar
Collaborator

> I have checked the cesm testing database and don't find any mention of WACCMX baseline failures due to changes between cime5.8.28 (cesm2_2_alpha06c) and cime5.8.32 (cesm2_2_alpha06f). There is mention of diffs in FXHIST, but that seems to be due to a fix in CAM. I'm seeing diffs in FXHIST, FXSD, and FX2000. @cacraigucar, do you think I should run the regression with just the update to cime5.8.32, to be sure?

@brian-eaton Yes, I would run the tests just to confirm for myself that the baselines remain the same when using cime5.8.32 for CAM regression testing.

@cacraigucar
Collaborator

@brian-eaton Also, if you do see answer changes, save that testing as we might want to squeeze in an intermediate CAM tag that just updates cime to cime5.8.32. Of course we would still need to understand why WACCMX has answer changes before doing so.

@fvitt
Collaborator

fvitt commented Nov 5, 2020

I see the ESMF lib version has changed between cime5.8.28 and cime5.8.32 (esmf-8.1.0b17 --> esmf-8.1.0b23) for cheyenne. This could conceivably explain the answer changes in WACCMX tests. I believe cime5.8.32 is what we are using in the CESM2.2.0 release.

@cacraigucar
Collaborator

@brian-eaton This sounds like the update of the CAM externals from cime5.8.28 to cime5.8.32 may be the culprit. If you confirm that, please open a PR for updating cime, and you may go ahead with making cam6_3_003 with that one change.

brian-eaton mentioned this pull request Nov 9, 2020
Collaborator

@fvitt left a comment

I am curious to know if there is anything to prevent someone from trying to run CAM physics with the MPAS dycore. The test cases you added seem to use only simplified physics packages.

Collaborator

@nusbaume left a comment

Thanks for the fixes! It all looks good to me now. I do have one last question, but whatever the answer is, it shouldn't be important enough to hold up this PR.

!
! Set startTimeStamp based on the start time of the simulation clock
!
startTime = mpas_get_clock_time(clock, MPAS_START_TIME, ierr)
Collaborator

Does this ierr value need to be checked here?

Collaborator Author

Check added.
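Presumably something along these lines (the abort call and message are illustrative, not necessarily the exact check that was added):

startTime = mpas_get_clock_time(clock, MPAS_START_TIME, ierr)
if (ierr /= 0) then
   ! fail loudly rather than continuing with an undefined start time
   call endrun('cam_mpas_subdriver: failed to get start time from the simulation clock')
end if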

Collaborator

@jtruesdal left a comment

Looks good. Thanks for including the FV3 fix.

@brian-eaton
Collaborator Author

> I am curious to know if there is anything to prevent someone from trying to run CAM physics with the MPAS dycore. The test cases you added seem to use only simplified physics packages.

No. Nor do I think there should be. That would just get in the way of the developers who are currently working on the full physics functionality.

Collaborator

@gold2718 left a comment

I have a question about get_hdim_name but if that works in all cases, everything else looks good. Thanks for all the fixes!

Comment on lines 914 to 918
ierr = pio_inq_dimid(fh_ini, 'ncol', ncol_did)
if (ierr == PIO_NOERR) then

ini_grid_hdim_name = 'ncol'

Collaborator

Does it always work to try ncol first? Can there be an initial data file with the dynamics state on the GLL grid that also has physics fields (e.g., the ones read from fh_ini in phys_inidat)? Does this algorithm do the right thing in that case?

Collaborator Author

Good call. Although the "optional" physics fields should no longer be written to the initial file, the code to do so is still in place and could be turned on. Switching the order so that ncol_d is queried before ncol would correctly deal with that. I'll make that change.
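Presumably something like the following reordering (a sketch based on the quoted snippet; 'ncol_d' is the dynamics-grid dimension name, and the fallback structure shown here is illustrative):

ierr = pio_inq_dimid(fh_ini, 'ncol_d', ncol_did)
if (ierr == PIO_NOERR) then
   ! dynamics (GLL) grid dimension found; any 'ncol' dimension belongs to the physics grid
   ini_grid_hdim_name = 'ncol_d'
else
   ierr = pio_inq_dimid(fh_ini, 'ncol', ncol_did)
   if (ierr == PIO_NOERR) then
      ini_grid_hdim_name = 'ncol'
   end if
end if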

brian-eaton merged commit d82fc45 into ESCOMP:cam_development Nov 19, 2020
cacraigucar changed the title from "Add MPAS-A dycore" to "cam6_3_004: Add MPAS-A dycore" Nov 19, 2020
gold2718 added a commit to gold2718/CAM that referenced this pull request Jan 8, 2026
cam_noresm2_3_v1.0.0i: Bug fixes and NorESM2.3 tasks

Summary: Fix bugs found in testing and also complete other NorESM2.3 tasks
Contributors: @gold2718, @DirkOlivie 
Reviewers: @DirkOlivie 
Purpose of changes: 
- NorESMhub#139 
- NorESMhub#113

Github PR URL: NorESMhub#246
Changes made to build system: None
Changes made to the namelist: None
Changes to the defaults for the boundary datasets: None
Substantial timing or memory changes: None

Testing: aux_cam_noresm and prealpha_noresm
All pass except for expected namelist and baseline changes.

Labels: enhancement (New feature or request)

Projects: Status: Done

10 participants