
Commit 7c6e141

Merge pull request #10 from CU-ESIIL/codex/rewrite-all-docs-for-cubedynamics

docs: rewrite site for CubeDynamics

2 parents 351a9bd + d0b0940

15 files changed (+434, -158 lines)

CITATION.cff

Lines changed: 20 additions & 0 deletions
New file:

```yaml
cff-version: 1.2.0
title: "CubeDynamics: Streaming Climate Cube Math for Environmental Data Science"
message: "If you use CubeDynamics, please cite it as described here."
type: software
authors:
  - family-names: Tuff
    given-names: Ty
version: 0.1.0
repository-code: "https://github.com/CU-ESIIL/climate_cube_math"
keywords:
  - climate
  - streaming
  - data cubes
  - xarray
  - dask
abstract: |
  CubeDynamics (`cubedynamics`) is a streaming-first Python library for
  constructing multi-source climate data cubes, computing variance and
  correlation diagnostics, and exporting derived "lexcubes" for dashboards and
  climate resilience workflows.
```

README.md

Lines changed: 51 additions & 83 deletions
````diff
@@ -1,110 +1,78 @@
-# Climate Cube Math
+# CubeDynamics (`cubedynamics`)
 
-`cubedynamics` provides streaming access to climate data cubes plus reusable
-statistics, vegetation-index helpers, and QA visualizations for understanding
-Sentinel-2, GRIDMET, PRISM, and related datasets. This repository also
-includes a MkDocs site and reproducible vignette that document the package.
+CubeDynamics is a streaming-first climate cube math library for building
+multi-source climate data cubes (PRISM, gridMET, NDVI/Sentinel, etc.) and
+computing correlations, variance, and trends without downloading entire
+collections.
+
+## Features
+
+- **Streaming and chunked access** to climate datasets so analyses can begin
+  before downloads finish.
+- **Climate lexcubes** – multi-dimensional cubes of climate statistics for
+  comparing vegetation, weather, and derived metrics over shared axes.
+- **Correlation, synchrony, and variance cubes** that summarize temporal
+  patterns such as drought stress, phenology shifts, and teleconnections.
+- **Notebook-friendly helpers** for Jupyter, VS Code, and workflow runners.
+- **Cloud and big-data ready** primitives that lean on `xarray`, `dask`, and
+  lazy execution.
 
 ## Installation
 
-You can install the latest library directly from the repository:
+Once the package is published on PyPI you will be able to install it with:
+
+```bash
+pip install cubedynamics
+```
+
+Until then install directly from GitHub:
 
 ```bash
-pip install git+https://github.com/CU-ESIIL/climate_cube_math.git
+pip install "git+https://github.com/CU-ESIIL/climate_cube_math.git@main"
 ```
 
-During development use an editable install from the repo root:
+Developers can work against the repo in editable mode:
 
 ```bash
 python -m pip install -e .
 ```
 
-## Quick start
+## Quickstart
 
 ```python
 import cubedynamics as cd
 
-s2 = cd.load_s2_cube(
-    lat=43.89,
-    lon=-102.18,
-    start="2023-06-01",
-    end="2023-09-30",
-    edge_size=512,
+# Example: stream a gridMET cube for a region and compute a variance cube
+cube = cd.stream_gridmet_to_cube(
+    aoi_geojson,
+    variable="pr",
+    dates=("2000-01", "2020-12"),
 )
-
-ndvi = cd.compute_ndvi_from_s2(s2)
-ndvi_z = cd.zscore_over_time(ndvi)
-```
-
-The same package still ships the ruled time hull helpers used in the training
-materials:
-
-* `cubedynamics.demo.make_demo_event()` builds a small GeoDataFrame that mimics
-  how a fire perimeter evolves through time.
-* `cubedynamics.hulls.plot_ruled_time_hull()` converts that data into a 3D ruled
-  surface so the temporal pattern can be inspected visually.
-
-## Documentation and vignette
-
-The public website is generated from the `docs/` folder using MkDocs Material
-and includes:
-
-1. A concise landing page that explains the project goals.
-2. A rendered copy of `docs/vignette.ipynb` so visitors can step through the
-   example without leaving the site.
-3. An API reference driven by `mkdocstrings` that documents the core
-   `cubedynamics` modules.
-
-To preview the site locally run:
-
-```bash
-mkdocs serve
+var_cube = cd.variance_cube(cube)
+var_cube.to_netcdf("gridmet_variance.nc")
 ```
 
-## Repository layout
-
-```
-code/cubedynamics/           # installable Python package
-  data/                      # Sentinel-2, GRIDMET, PRISM loaders
-  indices/                   # vegetation index helpers
-  stats/                     # anomaly, rolling, correlation utilities
-  viz/                       # QA and lexcube visualization helpers
-  utils/                     # chunking, reference pixel helpers
-  demo.py                    # demo GeoDataFrame generator
-  hulls.py                   # ruled time hull plotting helper
-  __init__.py
-
-docs/
-  index.md                   # landing page
-  api.md                     # mkdocstrings API reference
-  vignette.ipynb             # notebook rendered on the site
-  stylesheets/
-    extra.css                # small cosmetic tweaks for MkDocs Material
-
-.github/workflows/pages.yml  # deploys the docs site to GitHub Pages
-mkdocs.yml                   # MkDocs configuration
-pyproject.toml               # package metadata
-```
+Additional helpers can build NDVI z-score cubes, compute rolling correlation vs
+an anchor pixel, or export “lexcubes” for downstream dashboards. Follow the docs
+for more end-to-end examples while the streaming implementations are finalized.
 
-With this layout you only need to touch two places when extending the project:
-add or update Python modules inside `code/cubedynamics/` and describe those
-changes through Markdown or notebooks in `docs/`.
+## Documentation
 
-## Installation
+Full documentation: https://cu-esiil.github.io/climate_cube_math/
 
-For the streaming-first package preview install directly from GitHub:
+The GitHub Pages site hosts the narrative docs, quickstart, concepts, and API
+notes for CubeDynamics. Use `mkdocs serve` to preview changes locally.
 
-```bash
-pip install "git+https://github.com/CU-ESIIL/climate_cube_math.git@main"
-```
+## Contributing
 
-Once published to PyPI the goal is to allow a simple install:
+Contributions are welcome! Open an issue or pull request if you would like to
+add new data sources, improve the streaming primitives, or expand the
+statistical recipes. Please keep tests streaming-first (favor chunked I/O and
+mocked responses when possible) and include documentation updates alongside code
+changes.
 
-```bash
-pip install cubedynamics
-```
+## Citation
 
-`cubedynamics` follows a streaming-first philosophy that prefers chunked IO over
-full downloads. The accompanying pytest suite encodes this expectation by
-running streaming markers by default and skipping download-marked tests unless
-explicitly requested.
+If you use CubeDynamics in academic work, cite the project using the metadata in
+[`CITATION.cff`](CITATION.cff). A formal publication is planned; until then
+please cite the software release and repository.
````
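The `variance_cube` step in the new quickstart collapses the time axis to a per-pixel variance. As a rough illustration of that math only (plain Python standard library, not the `cubedynamics` API, whose exact signatures are still pre-release):

```python
from statistics import pvariance

def variance_over_time(cube):
    """cube: dict mapping a (y, x) pixel to its list of values over time.

    Returns a 'variance cube' collapsed along time: one value per pixel.
    """
    return {pixel: pvariance(series) for pixel, series in cube.items()}

# Toy 1x2 spatial grid with three time steps per pixel
cube = {
    (0, 0): [1.0, 2.0, 3.0],
    (0, 1): [5.0, 5.0, 5.0],  # constant series -> zero variance
}
print(variance_over_time(cube))
```

In the real library this reduction runs lazily over dask-backed `xarray` chunks, but the per-pixel arithmetic is the same.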

docs/climate_cubes.md

Lines changed: 44 additions & 0 deletions
New file:

````markdown
# Climate cubes

Climate cubes are the core abstraction in CubeDynamics. They are `xarray`
objects with shared `(time, y, x)` axes (and optional `band` or `variable`
dimensions) produced by the streaming loaders.

## Creating cubes

```python
from cubedynamics import stream_gridmet_to_cube

cube = stream_gridmet_to_cube(
    aoi_geojson,
    variable="tmmx",
    dates=("2010-01-01", "2020-12-31"),
)
print(cube.dims)
```

The loader harmonizes CRS, attaches metadata, and returns a lazily-evaluated
`xarray.Dataset`. Other loaders follow the same interface (`stream_prism_to_cube`,
`stream_sentinel2_to_cube`).

## Derived diagnostics

Once a cube exists, run statistics directly on the labeled dimensions:

```python
from cubedynamics.stats.anomalies import zscore_over_time
from cubedynamics.lexcubes.variance import variance_cube

ndvi_z = zscore_over_time(ndvi_cube, dim="time")
var_cube = variance_cube(cube, dim="time")
```

Every helper keeps the input axes intact so that downstream visualizations and
exports can consume the resulting lexcube without regridding.

## Exporting cubes

`cubedynamics` exposes helpers like `cube.to_netcdf(...)`, `cube.to_zarr(...)`,
or `lexcube_to_dataset(...)` to persist results and integrate with dashboards.
Large analyses rely on chunked writes through `dask` so the same scripts run in
cloud environments.
````
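The `zscore_over_time` diagnostic described in this new doc reduces, per pixel, to `(x - mean) / std` along the time axis. A minimal stdlib sketch of that math (not the library's own implementation, which operates on labeled `xarray` dimensions):

```python
from statistics import fmean, pstdev

def zscore_over_time(series):
    """Z-score one pixel's time series: (x - mean) / std along time."""
    mu = fmean(series)
    sigma = pstdev(series)  # population std, as a climatology would use
    if sigma == 0:
        return [0.0 for _ in series]  # constant series has no anomaly
    return [(x - mu) / sigma for x in series]

# One pixel's values over four time steps
pixel = [10.0, 12.0, 14.0, 16.0]
print([round(z, 3) for z in zscore_over_time(pixel)])
# → [-1.342, -0.447, 0.447, 1.342]
```

Because the transform is independent per pixel, it parallelizes cleanly over spatial chunks, which is why the cube version can stay lazy under dask.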

docs/concepts.md

Lines changed: 35 additions & 0 deletions
New file:

````markdown
# Concepts Overview

CubeDynamics is organized into three conceptual layers that compose to produce
climate lexcubes:

1. **Sources** – streaming adapters for Sentinel-2, PRISM, gridMET, and other
   gridded datasets. Each returns `xarray.Dataset` objects with shared
   `(time, y, x)` axes.
2. **Cube math primitives** – functions inside `cubedynamics.stats`,
   `cubedynamics.indices`, and `cubedynamics.lexcubes` that compute anomalies,
   correlations, and derived indicators.
3. **Pipelines & exports** – recipes that connect cubes to dashboards or models
   (NetCDF/Zarr writers, QA plots, and asynchronous workflows).

The sections below summarize how these layers interact.

## Source adapters

Every loader enforces consistent naming (time, y, x, band) and metadata (CRS,
units, history). Streaming-first behavior is preferred: data arrive chunked via
HTTP range requests, STAC assets, or cloud object storage signed URLs. Offline
fallbacks download only the required slices.

## Lexcube builders

Lexcubes are multi-dimensional cubes that store derived metrics such as
variance, synchrony, or NDVI anomalies along the same axes as the source data.
They can be nested (e.g., a `dataset` containing multiple diagnostics) and are
ready for export.

## Analysis & visualization

Downstream helpers provide rolling correlation, tail dependence, QA plots, and
hooks for interactive dashboards. See [Climate cubes](climate_cubes.md) and
[Correlation cubes](correlation_cubes.md) for example notebooks and API usage.
````

docs/concepts/backends.md

Lines changed: 12 additions & 12 deletions
````diff
@@ -1,28 +1,28 @@
 # Climate data backends
 
 `cubedynamics` treats "climate cube" sources as interchangeable once they
-provide the standard `(time, y, x)` layout.  The data loaders expose
+provide the standard `(time, y, x)` layout. The data loaders expose
 consistent knobs and return dask-backed `xarray.Dataset` objects so the rest
 of the math stack can focus on statistics instead of I/O details.
 
 ## Sentinel-2
 
-* Loader: `climate_cube_math.data.sentinel2.load_s2_cube`
+* Loader: `cubedynamics.stream_sentinel2_to_cube`
 * Purpose: multispectral reflectance for vegetation index and QA work.
 * Typical recipe: compute NDVI with
-  `climate_cube_math.indices.vegetation.compute_ndvi_from_s2` and then derive
-  z-scores or temporal anomalies with `climate_cube_math.stats.anomalies`.
+  `cubedynamics.indices.vegetation.compute_ndvi_from_s2` and then derive
+  z-scores or temporal anomalies with `cubedynamics.stats.anomalies`.
 
 ## GRIDMET
 
-* Loader: `climate_cube_math.data.gridmet.load_gridmet_cube`
+* Loader: `cubedynamics.stream_gridmet_to_cube`
 * Purpose: daily meteorological drivers (temperature, precipitation, etc.).
 * Streaming-first design: attempts to return a lazily-evaluated cube, falling
   back to an in-memory download only when streaming is not available.
 
 ```python
-from climate_cube_math.data.gridmet import load_gridmet_cube
-from climate_cube_math.stats.anomalies import zscore_over_time
+from cubedynamics import stream_gridmet_to_cube
+from cubedynamics.stats.anomalies import zscore_over_time
 
 aoi = {
     "min_lon": -105.4,
@@ -31,7 +31,7 @@ aoi = {
     "max_lat": 40.1,
 }
 
-gridmet = load_gridmet_cube(
+gridmet = stream_gridmet_to_cube(
     variables=["tmax"],
     start="2000-01-01",
     end="2000-12-31",
@@ -44,13 +44,13 @@ tmax_z = zscore_over_time(gridmet["tmax"])
 
 ## PRISM
 
-* Loader: `climate_cube_math.data.prism.load_prism_cube`
+* Loader: `cubedynamics.stream_prism_to_cube`
 * Purpose: high-resolution precipitation and temperature summaries.
 * Uses the same streaming-first contract as GRIDMET.
 
 ```python
-from climate_cube_math.data.prism import load_prism_cube
-from climate_cube_math.stats.anomalies import zscore_over_time
+from cubedynamics import stream_prism_to_cube
+from cubedynamics.stats.anomalies import zscore_over_time
 
 aoi = {
     "min_lon": -105.4,
@@ -59,7 +59,7 @@ aoi = {
     "max_lat": 40.1,
 }
 
-prism = load_prism_cube(
+prism = stream_prism_to_cube(
     variables=["ppt"],
     start="2000-01-01",
     end="2000-12-31",
````

docs/contributing.md

Lines changed: 16 additions & 0 deletions
New file:

````markdown
# Contributing

Thank you for helping build CubeDynamics! This project aims to keep climate cube
math lightweight, streaming-first, and well documented. To contribute:

1. Fork or branch from `main` and create feature branches for your work.
2. Install dependencies in editable mode (`python -m pip install -e .[dev]` once
   extras are defined) and run the test suite via `pytest`.
3. Add or update documentation in `docs/` for every new feature or API change.
4. Prefer streaming and chunked operations. If you add a function that downloads
   data, ensure it can operate lazily with `dask` when possible.
5. Open a pull request describing the change, test coverage, and any data access
   requirements.

Issues and discussions are also welcome for roadmap ideas, new data sources, or
lexcube visualizations.
````
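Guideline 4 in this new contributing doc ("prefer streaming and chunked operations") boils down to a pattern like the following generic stdlib sketch, which is not CubeDynamics code but shows the shape the project asks for: consume a byte stream in fixed-size chunks so processing can begin before the transfer completes.

```python
import io

def iter_chunks(stream, chunk_size=8192):
    """Yield fixed-size byte chunks; work can start before the read ends."""
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            return
        yield chunk

# Stand-in for an HTTP response body or object-store stream
payload = io.BytesIO(b"x" * 20000)
sizes = [len(c) for c in iter_chunks(payload, chunk_size=8192)]
print(sizes)  # [8192, 8192, 3616]
```

In tests, the same generator can be fed a mocked in-memory stream (as above) so streaming behavior is exercised without any network download.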
