
Commit 90ba4e9

Nits (#53)

* rename book lock file and other small nb changes
* small changes
* small fixes
1 parent 704ff22 commit 90ba4e9

29 files changed (+80, -2032 lines)

book/_config.yml

Lines changed: 2 additions & 1 deletion
@@ -13,7 +13,8 @@ only_build_toc_files: false
 # Force re-execution of notebooks on each build.
 # See https://jupyterbook.org/content/execute.html
 execute:
-  execute_notebooks: cache #'auto'
+  execute_notebooks: 'auto'
+
   allow_errors: true
   timeout: 1500
   exclude_patterns:
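
For context, after this change the `execute` block of the Jupyter Book `_config.yml` would read roughly as follows (a sketch assuming the two-space indentation Jupyter Book configs conventionally use; comments are ours, not from the commit):

```yaml
execute:
  # 'auto' executes only notebooks that are missing stored outputs;
  # the previous 'cache' setting reused cached outputs across builds.
  execute_notebooks: 'auto'
  allow_errors: true    # keep building even if a cell raises
  timeout: 1500         # per-cell execution timeout, in seconds
```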

book/background/4_tutorial_data.md

Lines changed: 1 addition & 1 deletion
@@ -47,7 +47,7 @@ If you are unfamiliar with the principles of synthetic aperture radar (SAR) imag
 We provide a very brief overview of RTC processing below but it is not intended to replace the aforementioned resources.
 :::

-```{figure} ../images/SARticle_first-fig_redone-06.jpg
+```{figure} imgs/SARticle_first-fig_redone-06.jpg
 ---
 height: 250 px
 figclass: margin-caption

book/background/6_relevant_concepts.md

Lines changed: 1 addition & 1 deletion
@@ -50,7 +50,7 @@ Climate Forecast (CF) Metadata Conventions

 >The CF metadata conventions are designed to promote the processing and sharing of files created with the NetCDF API. The conventions define metadata that provide a definitive description of what the data in each variable represents, as well as the spatial and temporal properties of the data. This enables users of data from different sources to decide which quantities are comparable and facilitates building applications with powerful extraction, regridding, and display capabilities. The CF convention includes a standard name table, which defines strings that identify physical quantities.

-CF metadata conventions set common expectations for metadata names and locations across datasets. In this tutorial, we will use tools such as [cf_xarray]() that leverage CF conventions to add programmatic handling of CF metadata to Xarray objects, meaning users can spend less time wrangling metadata.
+CF metadata conventions set common expectations for metadata names and locations across datasets. In this tutorial, we will use tools such as [cf_xarray](https://cf-xarray.readthedocs.io/en/latest/) that leverage CF conventions to add programmatic handling of CF metadata to Xarray objects, meaning users can spend less time wrangling metadata.

 Spatio-temporal Asset Catalog (STAC)
 STAC is a metadata specification for geospatial data that allows the data to be more easily "worked with, indexed, and discovered" [$\tiny \nearrow$](https://stacspec.org/en). It does this by setting a standard format for how metadata will be structured. This functions like setting a common expectation that all users of the data can rely on so that they know where certain information will be located and how it will be stored.
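
To illustrate the kind of metadata handling that cf_xarray automates, the sketch below uses plain Python dicts (no external dependencies; all variable names are hypothetical) to resolve a variable by its CF `standard_name` attribute rather than by its provider-specific name, which is the core idea behind cf_xarray's accessor:

```python
# Two hypothetical datasets: the variable names differ between providers,
# but both carry the same CF standard_name attribute.
dataset_a = {"t2m": {"standard_name": "air_temperature", "units": "K"}}
dataset_b = {"TMP_2maboveground": {"standard_name": "air_temperature", "units": "K"}}

def get_by_standard_name(dataset, standard_name):
    """Return (variable_name, attrs) for the variable whose CF
    standard_name matches, regardless of its provider-specific name."""
    for name, attrs in dataset.items():
        if attrs.get("standard_name") == standard_name:
            return name, attrs
    raise KeyError(standard_name)

# Both lookups succeed without knowing the provider-specific names:
name_a, _ = get_by_standard_name(dataset_a, "air_temperature")
name_b, _ = get_by_standard_name(dataset_b, "air_temperature")
```

With cf_xarray the equivalent lookup is done through the `.cf` accessor on real Xarray objects; the point here is only the convention-based indirection.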

book/background/ard_data_tidying.md

Lines changed: 0 additions & 23 deletions
This file was deleted.

book/background/background.md

Lines changed: 1 addition & 1 deletion
@@ -2,7 +2,7 @@

 This section discusses different topics that are relevant to the material demonstrated in the tutorials.

-**[2.1 Context and motivation](1_context_and_motivation.md)**
+**[2.1 Context and motivation](1_context_motivation.md)**
 Background on how the increasing availability of earth observation data and computational tools impacts how scientists conduct research.

 **[2.2 Data cubes](2_data_cubes.md)**

book/background/software.md

Lines changed: 0 additions & 85 deletions
This file was deleted.

book/book_refs.bib

Lines changed: 2 additions & 2 deletions
@@ -97,7 +97,7 @@ @book{Dasu_2003_Exploratory
   publisher = {John Wiley \& Sons, Inc.},
   address = {USA},
   edition = {1},
-  abstract = {From the Publisher:A groundbreaking addition to the existing literature, Exploratory Data Mining and Data Cleaning serves as an important reference for data analysts who need to analyze large amounts of unfamiliar data, operations managers, and students in undergraduate or graduate-level courses, dealing with data analysis and data mining.}
+  doi = {10.1002/0471448354}
 }
 @article{frantzFORCELandsatSentinel22019,
   title = {{{FORCE}}---{{Landsat}} + {{Sentinel-2 Analysis Ready Data}} and {{Beyond}}},
@@ -127,7 +127,7 @@ @inproceedings{geffner_2000_dynamic
   pages = {237--253},
   publisher = {Springer Berlin Heidelberg},
   address = {Berlin, Heidelberg},
-  abstract = {Range sum queries on data cubes are a powerful tool for analysis. A range sum query applies an aggregation operation (e.g., SUM, AVERAGE) over all selected cells in a data cube, where the selection is specified by providing ranges of values for numeric dimensions. We present the Dynamic Data Cube, a new approach to range sum queries which provides efficient performance for both queries and updates, which handles clustered and sparse data gracefully, and which allows for the dynamic expansion of the data cube in any direction.},
+  doi = {doi.org/10.1007/3-540-46439-5_17},
   isbn = {978-3-540-46439-6}
 }


book/conclusion/datacubes_revisited.md

Lines changed: 1 addition & 1 deletion
@@ -66,7 +66,7 @@ If you are working with a dataset where information about how the variables rela

 ##### 2. Compare two datasets by combining them into a single cube with an additional dimension
 - To compare data from different satellites within the ITS_LIVE dataset, we create a new data cube with a `'sensor'` dimension ([*ITS_LIVE tutorial, notebook 4 - exploratory analysis of a single glacier*](../itslive/nbs/4_exploratory_data_analysis_single.ipynb)).
-- Adding `source` dimension when comparing ASF and PC backscatter datasets ([*Sentinel-1 tutorial, notebook 5 - comparing backscatter datasets*](../sentinel1/nbs/5_compare_backscatter_datasets.ipynb)).
+- Adding `source` dimension when comparing ASF and PC backscatter datasets ([*Sentinel-1 tutorial, notebook 5 - comparing backscatter datasets*](../sentinel1/nbs/5_comparing_s1_rtc_datasets.ipynb)).
   - In this example, the goal of our analysis changes from observing backscatter to observing how measurements of backscatter from two processing pipelines differ from one another. This implies a different shape of the data that is relevant to this question; the appropriate dimensions change from `(x, y, time, band)` to `(x, y, time, band, source)`.
   - Adding a source dimension lets us index the combined dataset by 'source' and compare the two 'source' elements on a common grid and scale.
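
The combine-along-a-new-dimension pattern described in this hunk can be sketched in plain Python (a stand-in for xarray's `xr.concat`; the tutorial itself uses xarray, and the values and names below are hypothetical):

```python
# Two backscatter "datasets" on a common (y, x) grid, one per source.
asf = [[1.0, 2.0], [3.0, 4.0]]
pc  = [[1.1, 2.1], [2.9, 4.2]]

# "Concatenate" along a new leading dimension labelled by source,
# so the cube's dims become (source, y, x).
cube = {"asf": asf, "pc": pc}

# Indexing by 'source' selects one dataset; differencing compares the
# two sources element-wise on the shared grid.
diff = [
    [a - b for a, b in zip(row_a, row_b)]
    for row_a, row_b in zip(cube["asf"], cube["pc"])
]
```

In xarray the same idea is expressed by concatenating the two datasets along a new `source` dimension and then selecting with `.sel(source=...)`, which is what makes the comparison on a common grid and scale straightforward.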

book/conclusion/nb_summaries.md

Lines changed: 0 additions & 130 deletions
This file was deleted.
