diff --git a/intermediate/hierarchical_computation.ipynb b/intermediate/hierarchical_computation.ipynb
index 17760d58..2c73a8d9 100644
--- a/intermediate/hierarchical_computation.ipynb
+++ b/intermediate/hierarchical_computation.ipynb
@@ -137,17 +137,21 @@
    "source": [
     "## Applying functions designed for `Dataset` with `map_over_datasets`\n",
     "\n",
-    "What if we wanted to convert the data to log-space? For a `Dataset` or `DataArray`, we could just use {py:func}`xarray.ufuncs.log`, but that does not support `DataTree` objects, yet:"
+    "What if we wanted to apply an element-wise function, for example to convert the data to log-space? For a `DataArray` we could just use {py:func}`numpy.log`, but this is not supported for `DataTree` objects:"
    ]
   },
   {
    "cell_type": "code",
    "execution_count": null,
    "id": "12",
-   "metadata": {},
+   "metadata": {
+    "tags": [
+     "raises-exception"
+    ]
+   },
    "outputs": [],
    "source": [
-    "xr.ufuncs.log(tree)"
+    "np.log(tree)"
    ]
   },
   {
@@ -155,8 +159,6 @@
    "id": "13",
    "metadata": {},
    "source": [
-    "Note how the result is a empty `Dataset`?\n",
-    "\n",
     "To map a function to all nodes, we can use {py:func}`xarray.map_over_datasets` and {py:meth}`xarray.DataTree.map_over_datasets`: "
    ]
   },
@@ -203,8 +205,7 @@
    "id": "18",
    "metadata": {
     "tags": [
-     "raises-exception",
-     "hide-output"
+     "raises-exception"
     ]
    },
    "outputs": [],
@@ -235,6 +236,78 @@
     "\n",
     "tree.map_over_datasets(demean)"
    ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "21",
+   "metadata": {},
+   "source": [
+    "## Escape hatches\n",
+    "\n",
+    "For some more complex operations, it might make sense to work on {py:class}`xarray.Dataset` or {py:class}`xarray.DataArray` objects and reassemble the tree afterwards.\n",
+    "\n",
+    "Let's look at a new dataset:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "22",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "precipitation = xr.tutorial.open_datatree(\"precipitation.nc4\").load()\n",
+    "precipitation"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "23",
+   "metadata": {},
+   "source": [
+    "Suppose we wanted to interpolate the observed precipitation onto the grid of the modelled precipitation. We could use `map_over_datasets` for this, but we can also have a bit more control:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "24",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "interpolated = xr.DataTree.from_dict(\n",
+    "    {\n",
+    "        \"/\": precipitation.ds,\n",
+    "        \"/observed\": precipitation[\"/observed\"].ds.interp(\n",
+    "            lat=precipitation[\"/reanalysis/lat\"],\n",
+    "            lon=precipitation[\"/reanalysis/lon\"],\n",
+    "        ),\n",
+    "        \"/reanalysis\": precipitation[\"/reanalysis\"],\n",
+    "    }\n",
+    ")\n",
+    "interpolated"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "25",
+   "metadata": {},
+   "source": [
+    "::::{admonition} Exercise\n",
+    ":class: tip\n",
+    "Compute the difference between total observed and modelled precipitation, and plot the result.\n",
+    "\n",
+    ":::{admonition} Solution\n",
+    ":class: dropdown\n",
+    "\n",
+    "```python\n",
+    "total = precipitation.sum(dim=[\"lon\", \"lat\"])\n",
+    "difference = total[\"/observed/precipitation\"] - total[\"/reanalysis/precipitation\"]\n",
+    "difference.plot()\n",
+    "```\n",
+    ":::\n",
+    "::::\n"
+   ]
   }
  ],
  "metadata": {