`tutorials/parquet-catalog-demos/irsa-hats-with-lsdb.md` (6 additions, 6 deletions)
```diff
@@ -19,7 +19,7 @@ kernelspec:
 
 By the end of this tutorial, you will learn how to:
 
-- Use the `lsdb` library to access IRSA HATS collections from the cloud.
+- Use the `lsdb` library to access IRSA HATS Collections from the cloud.
 - Define spatial, column, and row filters to read only a portion of large HATS catalogs.
 - Crossmatch catalogs using `lsdb` and visualize the results.
 - Perform index searches on HATS catalogs using `lsdb`.
```
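Conceptually, the crossmatch step listed above pairs each source in one catalog with its nearest neighbour in the other, keeping pairs closer than some match radius. A toy pure-Python sketch of that idea (illustrative only: the coordinates are made up, and lsdb performs this at scale over HATS partitions rather than by brute force):

```python
import math

def angular_sep_deg(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees via the haversine formula."""
    r1, d1, r2, d2 = map(math.radians, (ra1, dec1, ra2, dec2))
    a = (math.sin((d2 - d1) / 2) ** 2
         + math.cos(d1) * math.cos(d2) * math.sin((r2 - r1) / 2) ** 2)
    return math.degrees(2 * math.asin(math.sqrt(a)))

# Made-up (ra, dec) positions in degrees, purely for illustration.
cat_a = [(10.0, 20.0), (11.0, 21.0)]
cat_b = [(10.0001, 20.0001), (50.0, -10.0)]
radius_deg = 1.0 / 3600  # 1 arcsec match radius

# Brute-force nearest-neighbour match within the radius.
matches = []
for i, (ra, dec) in enumerate(cat_a):
    seps = [angular_sep_deg(ra, dec, rb, db) for rb, db in cat_b]
    j = min(range(len(cat_b)), key=seps.__getitem__)
    if seps[j] <= radius_deg:
        matches.append((i, j))

print(matches)  # [(0, 0)]
```

Only the first source in `cat_a` has a counterpart within 1 arcsec, so a single pair is returned.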
```diff
@@ -36,7 +36,7 @@ For more details on HATS collections, partitioning, and schema organization, see
 This notebook demonstrates how to access and analyze HATS collections using the [lsdb](https://docs.lsdb.io/en/latest/index.html) Python library, which makes it convenient to work with these large datasets.
 We will explore the following IRSA HATS collections in this tutorial:
 
-- Euclid Q1: MER (multi-wavelength mosaics) final catalog, photometric redshift catalogs, and spectroscopic catalogs joined on MER Object ID.
+- Euclid Q1 Merged Objects: 14 Euclid Q1 tables, including MER (multi-wavelength mosaics) final catalog, photometric redshift catalogs, and spectroscopic catalogs joined on MER Object ID.
 - ZTF DR23 Objects Table: catalog of PSF-fit photometry detections extracted from ZTF reference images, including "collapsed-lightcurve" metrics.
 - ZTF DR23 Lightcurves: catalog of PSF-fit photometry detections extracted from single-exposure images at the locations of Objects Table detections.
 
```
````diff
@@ -48,7 +48,7 @@ We will use lsdb to leverage HATS partitioning for performing fast spatial queri
 
 ```{code-cell} ipython3
 # Uncomment the next line to install dependencies if needed.
````

A later hunk (around lines 143–155; its header was lost in extraction):

````diff
 - This catalog is opened [lazily](https://docs.lsdb.io/en/latest/tutorials/lazy_operations.html), i.e., no data is read from the S3 bucket into memory yet.
 - Since we did not specify any columns to select from this very wide catalog, we get 7 default columns out of 1593 available columns!
-- We see the HEALPix order and pixels at which this catalog is partitioned.
+- We see some of the HEALPix orders and pixels at which this catalog is partitioned.
 
 Let's plot the sky coverage of this catalog:
 
 ```{code-cell} ipython3
 euclid_catalog.plot_pixels()
 ```
 
-Since HATS uses adaptive tile-based partitioning, we see higher order HealPix tiles in regions with higher source density and lower order tiles in regions with lower source density.
+Since HATS uses adaptive tile-based partitioning, we see higher order HEALPix tiles in regions with higher source density and lower order tiles in regions with lower source density.
 
 Now let's open the ZTF DR23 Objects catalog in a similar manner and view its sky coverage:
````
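The adaptive partitioning described in that hunk follows directly from HEALPix geometry: at order k the sphere is split into 12 × 4^k equal-area tiles, so each step up in order quarters the tile area. A minimal pure-Python illustration of that scaling (not part of the tutorial's notebook):

```python
import math

# Full sky in square degrees: 4*pi steradians converted to deg^2 (~41253).
FULL_SKY_DEG2 = 4 * math.pi * (180 / math.pi) ** 2

# At HEALPix order k there are 12 * 4**k equal-area tiles, so each
# additional order quarters the tile area. HATS picks higher orders
# (smaller tiles) where source density is high, lower orders where it is low,
# keeping the row count per partition roughly uniform.
for order in (2, 5, 8):
    ntiles = 12 * 4 ** order
    print(f"order {order}: {ntiles:>7} tiles, ~{FULL_SKY_DEG2 / ntiles:.4f} deg^2 per tile")
```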
````diff
@@ -177,7 +177,7 @@ euclid_DF_N_center
 ```
 
 ```{code-cell} ipython3
-# euclid_DF_N_radius = 3 * u.deg # ceil(sqrt(22.9 / pi)) approximate radius to cover almost entire DF-N
+# euclid_DF_N_radius = 3 * u.deg # ceil(sqrt(22.9 / pi)) approximate radius to cover almost entire EDF-N
 euclid_DF_N_radius = 0.5 * u.deg # smaller radius to reduce execution time for this tutorial
````
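The commented-out 3-degree radius comes from a simple area-to-radius conversion: a circle covering the field's area A has radius sqrt(A / pi). A quick check of that arithmetic, using the 22.9 deg² figure quoted in the comment itself:

```python
import math

# EDF-N spans roughly 22.9 deg^2 (the figure from the code comment above).
area_deg2 = 22.9
radius_deg = math.sqrt(area_deg2 / math.pi)  # radius of a circle of equal area
print(radius_deg, math.ceil(radius_deg))  # ~2.70, rounded up to 3
```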