
Commit f9037d7

vbrancat, rtburns-jpl, Liang Yu, and gshiroma authored and committed
InSAR workflow: Interferogram filtering (#806)
* Modify runconfig to include insar_filtering options
* Expose different std in range/azimuth for gaussian filter in rubbersheet
* Change blocksize to lines_per_block in insar_filtering schema/defaults
* Add insar_filtering implementation
* Add insar_filtering to persistence
* Add insar_filtering to InSAR workflow
* Add insar_filtering unit test
* Address case of None interferogram_path
* Remove check on filter type, simplify block reading
* Add check to insar filtering cfg in insar runconfig
* Correct checks for gaussian insar filtering dictionary
* Correct kernels for gaussian filter and specify write block for h5py write_direct
* Formatting to pep8 notation
* Replace import pybind_isce with import isce3
* Switch kernel shape from list to tuple
* Revamping block processing
* Fix compilation error due to API change in HDF5 v1.12
  HDF5 v1.12 changed the signature of H5Ovisit_by_name. Using the new API is actually more efficient, since we can avoid retrieving fields that aren't needed for the visitor function. I'm also checking H5_USE_110_API so that anyone explicitly opting for the old API won't be affected by this change.
* Modify insar filtering unit test:
* Remove int in favour of integer division
* Remove print of RIFG path and correct typos
* Remove unnecessary check for boxcar filter (present in share/nisar/defaults/insar.yaml)
* Change name of step insar_filtering to filter_interferogram
* Change name of step insar_filtering to filter_interferogram
* Additional block margin for GSLC and CUDA geocode (#833)
* update block margin
* revert changes brought by mistake from PR #783
* Update cxx/isce3/geocode/GeocodeCov.cpp
* added extra margin for GSLC block interp to address gaps in output block boundaries. changes mirror PR 830
* port extra margin fix to CUDA geocode. std::min/max replaced to account for unsigned size_t
  Co-authored-by: Gustavo Shiroma <[email protected]>
  Co-authored-by: Liang Yu <[email protected]>
* Rename pybind_nisar to nisar (#790)
* rename pybind_nisar directories to nisar
* replace pybind_nisar with nisar in code
* Update GeocodeCov interp block margin (#830)
* update block margin
* revert changes brought by mistake from PR #783
* Update cxx/isce3/geocode/GeocodeCov.cpp
  Co-authored-by: Liang Yu <[email protected]>
  Co-authored-by: Liang Yu <[email protected]>
* Antenna EL Power Pattern Estimation (#834)
* Elevation Power Pattern Estimator module
  - Add new C++ class "ElPatternEst" under "isce3/antenna"
  - Add respective pybind11 module for "ElPatternEst"
  - Add a "detail" namespace along with new helper module "WinChirpRgCompRow" under "isce3/antenna".
  - Add a new overloaded C++ functionality "LookIncFromSr" to "geometry" module under "isce3/geometry"
  - Add a pybind11 module for overloaded functionality "LookIncFromSr".
  - Add a new C++ module "polyfunc" to perform 1-D polyfit functionalities under namespace "isce3/math".
  - Add a pybind11 wrapper for C++ class "Poly1d".
  - Add a C++ test suite for module "polyfunc" under "isce3/math".
  - Add a python test suite for pybind module "lookIncFromSr" under geometry
  - Add pybind test suite for class "ElPatternEst"
  - Add an L0B test data file from truncated ALOS1 PALSAR data used for V&V of "ElPatternEst".
  - Run autopep8 and clang-format
* Fix issue #849 (#850)
  - Get rid of structured binding used in `lambda` capture in "geometry.cpp" due to problem with `clang` on Mac.
* Fix variant usage for macOS < 10.14
  macOS does not support std::bad_variant_access until 10.14, and some package managers target macOS < 10.14, so we need to avoid parts of std::variant which might raise such exceptions.
* save interpolated DEM before evaluating if geocoded pixel is valid (#848)
* Workflow unit tests fixed to run with CPU-only ISCE3 (#843)
* correctly set rubbersheet as CUDA-only in CMake
* added CPU-only check for geocode_insar test
* Fix multi-channel RSLC processing (#844)
* Fix doppler LUT write when we've already written one band.
* Prevent string type promotion on numpy.append.
* Whitespace between sentences.
* Use better variable name.
* Achieve type stability of polarization list in a simpler way.
* Two LOC begone!
* Add ground-track velocity to metadata cubes (#827)
* add ground-track velocity to metadata cubes
* rename pybind_nisar to nisar
* rename pybind_nisar to nisar (2)
* rename pybind_nisar to nisar (3)
* rename pybind_nisar to nisar (4)
* rename pybind_nisar to nisar (5)
* remove static from writeVectorDerivedCubes()
* Implement separate filtering function for other modules usage
* Accommodate usage of freq/pol-based mask during interferogram filtering
* Improve documentation for filter function
* moved block param code to generator
* moved IO standalone functions
* made block parameters dataclass and restore missing imports
* adapt changes to filter_interferogram
* Quick PR to fix GCOV block artifacts due to insufficient DEM margin (#857)
* fix DEM loading margin
* change DEM loading margin default from 50 pixels to 100 pixels
* use default DEM loading margin
* addressed PR comments
  updated block param generator with clearer name and added pad shape x
  param names updated for clarity
  assigned output of GDAL array retrieval call
  fixed usage of filter padding
  other typo fixes
* fixed incorrect input arg
* fixed data dimension mix-up and incorrect output parameter in write function
* documentation and cleanup
  removed extraneous parameters from generator
  added docstrings
  renamed filter to avoid conflict with built-in function
* fix single pol debug
* Fix topo for DEMs in projected coordinates (#780)
* rename functions starting with GetDemCoords to getDemCoords
* substitute topo interpolateXY() with getDemCoords()
* update envisat golden dataset
* update winnipeg golden dataset
* make topo unit test print which files and bands it is checking
* use ECEF coordinates rather than Earth radius to calculate height derivatives
* remove unnecessary code
* bring changes to CUDA topo
* bring changes to CUDA topo (2)
* bring changes to CUDA topo (3)
* add z-coordinate to output projection inverse transform for cases in which the input ellipsoid differs from the output ellipsoid
* add z-coordinate to output projection inverse transform for cases in which the input ellipsoid differs from the output ellipsoid (2)
* declare variable and assign value in the same line
* revert changes: declare variable and assign value in the same line
* add docstrings to getDemCoordsSameEpsg() and getDemCoordsDiffEpsg()
* Update cxx/isce3/geometry/Topo.cpp
  Co-authored-by: Geoffrey M Gunter <[email protected]>
  Co-authored-by: Geoffrey M Gunter <[email protected]>
* Cost Function of (rising) Edge Method in EL Pointing Estimation (roll offset) (#856)
  - Add new C++ classes "RootFind1dNerwton" and "RootFind1dSecant" under "isce3/math" for finding the root of a univariate function (1-D). Both are derived classes from a common base class "RootFind1dBase" sitting under "isce3/math/detail".
  - Add a single C++ test suite "root_find1d" for all root finding classes.
  - Add a C++ module "edge_method_cost_func" under "isce3/antenna" containing an overloaded cost function as the core of the rising "edge method" used for EL pointing estimation (roll angle offset).
  - Add a C++ test suite for "edge_method_cost_func".
  - Add a pybind11 module for "edge_method_cost_func".
  - Add a python test suite for "edge_method_cost_func" similar to its C++ counterpart.
  - Add a reference to "references.bib".
  - Remove extra blank line in module "geometryfunc.h"
  - Run clang-format
  - Run autopep8
* only save nlooks image if save_nlooks flag is set (#860)
* Quick updates to GCOV workflow parameters (#854)
* expose RTC DEM upsampling to the GCOV runconfig, set clip_min, clip_max, and rtc_min_value_db defaults to NaN
* set RTC DEM upsampling default to 1
* Quick PR to add the listOfCovarianceTerms to GCOV products (#855)
* add HDF5 dataset listOfCovarianceTerms to GCOV products
* fix function _save_list_cov_terms()
* fix function _save_list_cov_terms() (2)
* fix function _save_list_cov_terms() (3)
* extend _save_list_cov_terms() to off-diagonal terms
* extend _save_list_cov_terms() to off-diagonal terms (2)
* extend _save_list_cov_terms() to off-diagonal terms (3)
* Improve documentation for filter_data.py
* Remove commented lines of code
* Remove unused variables
* Correct WriteArray usage
* Correct inverted raster shape for GDAL rasters
* Expose lines per block in rdr2geo and geo2rdr (#861)
* add getters/setters for rdr2geo and geo2rdr blocksize in C++ and CUDA
* added lines per block to rdr2geo & geo2rdr pybind constructors
* modified workflows for blocksize
* update yamls
* schema for rdr2geo and geo2rdr less restrictive
* default rdr2geo threshold tightened
* Add unit test for stand-alone usage of filter_data
* isort imports for unit test
* Add write_start_line to block generator
* Replace filter_data unit test with test using winnipeg data
* added missing bits to make GDAL rasters work
* added unit test for GDAL rasters
* Convert core::Matrix from pyre::grid to Eigen (#858)
* Convert isce3::core::Matrix pyre -> Eigen
* Remove unused Cube class
* Remove unused Matrix.icc
* Remove commented code
* Check size_t/ptrdiff_t coercion
* Drop extraneous typename
* Throw error for invalid pixel/line offset
* Remove Row{Array/Matrix}XX aliases
* Fix shadowed template parameter
* Fix matrix base class
* Fix indentation
* Fix build with Eigen 3.4.0
  Eigen 3.4.0 is here! https://eigen.tuxfamily.org/index.php?title=3.4#Changes_that_might_break_existing_code
* Make DenseMatrix's initializer_list constructor explicit to match the corresponding constructor added in Eigen 3.4.0, and guard it to avoid an ambiguous overloaded constructor
* Fix instances of float/double matrix indexing
* increase kernel row and col size
* modified path to match topo/topo.vrt
* swap filter_data call and scipy convolve 2d order to test GDAL
* fixed bugs from reordered filter calls in gdal test
* Revert "fixed bugs from reordered filter calls in gdal test"
  This reverts commit b37e86b9a2fcf8e5f198be6373c6b5d33306552c.
* Revert "swap filter_data call and scipy convolve 2d order to test GDAL"
  This reverts commit 63ad48b495ef31d22c2f6542b2dad9860747e825.
* Revert "modified path to match topo/topo.vrt"
  This reverts commit 8d2f52031db90597c64b4049da8e3311bc12d65d.
* Change dataset for gdal unit test to warped_winnipeg
* gdal unit test transform VRT to Tiff file
* Add close_dataset() to pybind Raster.cpp (#863)
* add close_dataset() to pybind Raster.cpp
* Update python/extensions/pybind_isce3/io/Raster.cpp
  Co-authored-by: Geoffrey M Gunter <[email protected]>
* only call GDALClose from Raster destructor if attribute _dataset is not null
* check GDAL dataset ownership before deleting it
  Co-authored-by: Geoffrey M Gunter <[email protected]>
* Remove old ampcor from build scripts
  Michael will soon update our ampcor to the new version to support the changed pyre::grid API. For now, we disable building it so that we can update pyre.
* Add PGE-requested fields to RSLC, GSLC, GCOV and InSAR schemas (#874)
* Remove CoverageIndicator from RSLC schema
* Add TrackFramePolygon and FullCoverageThresholdPercent in Geometry RSLC schema
* Add PGE requested fields to GSLC schema
* Add PGE requested fields in GCOV schema
* Add PGE requested fields to InSAR schema
* Correct typo in FullCoverageThresholdPercent field

Co-authored-by: vbrancat <[email protected]>
Co-authored-by: vbrancat <[email protected]>
Co-authored-by: Ryan Burns <[email protected]>
Co-authored-by: Liang Yu <[email protected]>
Co-authored-by: Gustavo Shiroma <[email protected]>
Co-authored-by: Liang Yu <[email protected]>
Co-authored-by: Hirad Ghaemi <[email protected]>
Co-authored-by: Brian P Hawkins <[email protected]>
Co-authored-by: Geoffrey M Gunter <[email protected]>
1 parent 9b8fb3d commit f9037d7


14 files changed: +878 -13 lines changed


cxx/isce3/cuda/signal/gpuCrossMul.h

Lines changed: 1 addition & 1 deletion
@@ -165,7 +165,7 @@ class isce3::cuda::signal::gpuCrossmul {
     bool _doCommonRangeBandFilter = false;

     // number of lines per block
-    size_t _rowsPerBlock = 8192;
+    size_t _rowsPerBlock = 2048;

     // upsampling factor
     size_t _oversample = 1;
Lines changed: 301 additions & 0 deletions
@@ -0,0 +1,301 @@
import os
import isce3
import h5py
import numpy as np
from osgeo import gdal
from dataclasses import dataclass


@dataclass
class BlockParam:
    '''
    Class for block-specific parameters.
    Facilitates block parameter exchange between functions.
    '''
    # Length of current block to filter; padding not included
    block_length: int

    # First line to write to for current block
    write_start_line: int

    # First line to read from dataset for current block
    read_start_line: int

    # Number of lines to read from dataset for current block
    read_length: int

    # Padding to be applied to read in current block. First tuple is padding to
    # be applied to top/bottom (along length). Second tuple is padding to be
    # applied to left/right (along width). Values in second tuple do not change;
    # included in class so one less value is passed between functions.
    block_pad: tuple

    # Width of current block. Value does not change per block; included in
    # class so one less value is passed between functions.
    data_width: int

def np2gdal_dtype(np_dtype):
    dict_np2gdal = {
        np.byte: gdal.GDT_Byte,
        np.ushort: gdal.GDT_UInt16,
        np.short: gdal.GDT_Int16,
        np.uintc: gdal.GDT_UInt32,
        np.intc: gdal.GDT_Int32,
        np.float32: gdal.GDT_Float32,
        np.float64: gdal.GDT_Float64,
        np.complex64: gdal.GDT_CFloat32,
        np.complex128: gdal.GDT_CFloat64}
    if np_dtype not in dict_np2gdal:
        # Unsupported numpy data type for GDAL conversion
        raise ValueError(f'Unsupported data type: {np_dtype}')
    else:
        return dict_np2gdal[np_dtype]

def block_param_generator(lines_per_block, data_shape, pad_shape):
    ''' Generator for block-specific parameter class.

    Parameters
    ----------
    lines_per_block: int
        Lines to be processed per block (in batch).
    data_shape: tuple(int, int)
        Length and width of input raster.
    pad_shape: tuple(int, int)
        Padding for the length and width of block to be filtered.

    Yields
    ------
    _: BlockParam
        BlockParam object for current block
    '''
    data_length, data_width = data_shape
    pad_length, pad_width = pad_shape

    # Calculate number of blocks to break raster into
    num_blocks = int(np.ceil(data_length / lines_per_block))

    for block in range(num_blocks):
        start_line = block * lines_per_block

        # Discriminate between first, last, and middle blocks
        first_block = block == 0
        last_block = block == num_blocks - 1 or num_blocks == 1
        middle_block = not first_block and not last_block

        # Determine block size; last block uses leftover lines
        block_length = data_length - start_line if last_block else lines_per_block

        # Determine padding along length: full padding for middle blocks,
        # half padding for first and last blocks
        read_length_pad = pad_length if middle_block else pad_length // 2

        # Determine 1st line of output
        write_start_line = block * lines_per_block

        # Determine 1st dataset line to read. Subtract half padding length
        # to account for additional lines to be read.
        read_start_line = block * lines_per_block - pad_length // 2

        # If applicable, save negative start line as deficit to account for later
        read_start_line, start_line_deficit = (
            0, read_start_line) if read_start_line < 0 else (
            read_start_line, 0)

        # Initial guess at number of lines to read; account for negative start at the end
        read_length = block_length + read_length_pad
        if not first_block:
            read_length -= abs(start_line_deficit)

        # Check for over-reading and adjust lines read as needed
        end_line_deficit = min(
            data_length - read_start_line - read_length, 0)
        read_length -= abs(end_line_deficit)

        # Determine block padding in length
        if first_block:
            # Only the top part of the block should be padded. If end_line_deficit=0
            # we have a sufficient number of lines to be read in the subsequent block
            top_pad = pad_length // 2
            bottom_pad = abs(end_line_deficit)
        elif last_block:
            # Only the bottom part of the block should be padded
            top_pad = abs(
                start_line_deficit) if start_line_deficit < 0 else 0
            bottom_pad = pad_length // 2
        else:
            # Top and bottom padding should account for any line deficit
            top_pad = abs(
                start_line_deficit) if start_line_deficit < 0 else 0
            bottom_pad = abs(end_line_deficit)

        block_pad = ((top_pad, bottom_pad),
                     (pad_width // 2, pad_width // 2))

        yield BlockParam(block_length, write_start_line, read_start_line,
                         read_length, block_pad, data_width)

    return

def get_raster_info(raster):
    ''' Determine raster shape and data type based on raster
    type (h5py.Dataset or GDAL-friendly raster).

    Parameters
    ----------
    raster: h5py.Dataset or str
        Raster whose shape and data type are to be determined. String value
        represents a filepath for GDAL rasters.

    Returns
    -------
    data_shape: tuple(int, int)
        Length and width of raster.
    data_type: object
        Data type of raster.
    '''
    if isinstance(raster, h5py.Dataset):
        return raster.shape, raster.dtype
    else:
        # Open input data using GDAL to get raster shape and data type
        ds = gdal.Open(raster, gdal.GA_ReadOnly)
        data_length = ds.RasterYSize
        data_width = ds.RasterXSize
        data_type = ds.GetRasterBand(1).DataType
        return (data_length, data_width), data_type

def get_raster_block(raster, block_param):
    ''' Get a block of data from raster.
    Raster can be an HDF5 dataset or a GDAL-friendly raster.

    Parameters
    ----------
    raster: h5py.Dataset or str
        Raster where a block is to be read from. String value represents a
        filepath for GDAL rasters.
    block_param: BlockParam
        Object specifying size of block and where to read from raster,
        and amount of padding for the read array

    Returns
    -------
    data_block: np.ndarray
        Block read from raster with shape specified in block_param.
    '''
    if isinstance(raster, h5py.Dataset):
        data_block = np.empty((block_param.read_length, block_param.data_width),
                              dtype=raster.dtype)
        raster.read_direct(data_block, np.s_[block_param.read_start_line:
                           block_param.read_start_line + block_param.read_length, :])
    else:
        # Open input data using GDAL and read the block from the first band
        ds_data = gdal.Open(raster, gdal.GA_Update)
        data_block = ds_data.GetRasterBand(1).ReadAsArray(0,
                                                          block_param.read_start_line,
                                                          block_param.data_width,
                                                          block_param.read_length)

    # Pad data_block with zeros according to pad_length/pad_width
    data_block = np.pad(data_block, block_param.block_pad,
                        mode='constant', constant_values=0)

    return data_block

def write_raster_block(out_raster, data, block_param):
    ''' Write processed block to out_raster.

    Parameters
    ----------
    out_raster: h5py.Dataset or str
        Raster where data (i.e., filtered data) needs to be written.
        String value represents filepath for GDAL rasters.
    data: np.ndarray
        Filtered data to write to out_raster.
    block_param: BlockParam
        Object specifying where and how much to write to out_raster.
    '''
    if isinstance(out_raster, h5py.Dataset):
        out_raster.write_direct(data,
                                dest_sel=np.s_[
                                    block_param.write_start_line:block_param.write_start_line + block_param.block_length,
                                    :])
    else:
        ds_data = gdal.Open(out_raster, gdal.GA_Update)
        ds_data.GetRasterBand(1).WriteArray(data, xoff=0,
                                            yoff=block_param.write_start_line)


def filter_data(input_data, lines_per_block,
                kernel_rows, kernel_cols, output_data=None, mask_path=None):
    ''' Filter data using two separable 1D kernels.

    Parameters
    ----------
    input_data: str
        File path to input data raster (GDAL-friendly)
    lines_per_block: int
        Number of lines to process in batch
    kernel_rows: float array
        1D kernel along rows direction
    kernel_cols: float array
        1D kernel along columns direction
    output_data: h5py.Dataset or str
        Raster where a block needs to be written to. String value represents
        file path for GDAL rasters. If not provided, input_data is overwritten
        with the output filtered data
    mask_path: str
        Filepath to the mask to use during filtering
    '''

    data_shape, data_type = get_raster_info(input_data)
    data_length, data_width = data_shape

    # Determine the amount of padding
    pad_length = 2 * (len(kernel_rows) // 2)
    pad_width = 2 * (kernel_cols.shape[1] // 2)
    pad_shape = (pad_length, pad_width)

    # Limit lines per block to the raster length
    lines_per_block = min(data_length,
                          lines_per_block)

    # Start block processing
    block_params = block_param_generator(lines_per_block, data_shape, pad_shape)
    for block_param in block_params:
        # Read a block of data. If input_data is an h5py.Dataset, read the block
        # directly from the HDF5 file. Otherwise, use GDAL to read the block.
        data_block = get_raster_block(input_data, block_param)

        # Determine whether filtering needs to be performed with or without a mask
        if mask_path is not None:
            # Use GDAL to extract a mask block and pad the mask
            # (mask needs to be the same shape as the input)
            ds_mask = gdal.Open(mask_path,
                                gdal.GA_ReadOnly)
            mask_block = ds_mask.GetRasterBand(1).ReadAsArray(0,
                                                              block_param.read_start_line,
                                                              block_param.data_width,
                                                              block_param.read_length)
            mask_block = np.pad(mask_block, block_param.block_pad,
                                mode='constant', constant_values=0)
            filt_data_block = isce3.signal.convolve2D(data_block,
                                                      mask_block,
                                                      kernel_cols,
                                                      kernel_rows,
                                                      False)
        else:
            filt_data_block = isce3.signal.convolve2D(data_block,
                                                      kernel_cols,
                                                      kernel_rows,
                                                      False)
        # If no value provided for output_data, overwrite the existing input
        # with the filtered output. Otherwise write the filtered output to
        # output_data.
        out_raster = input_data if output_data is None else output_data

        # If writing to a GDAL raster, prepare the output file
        if not isinstance(out_raster, h5py.Dataset) and not os.path.isfile(out_raster):
            raster = isce3.io.Raster(path=out_raster, width=data_width,
                                     length=data_length, num_bands=1,
                                     dtype=data_type, driver_name='GTiff')
            del raster

        write_raster_block(out_raster, filt_data_block, block_param)
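
A quick illustration (not part of the commit) of the block bookkeeping that block_param_generator produces. It assumes the functions above are importable; the raster shape, block size, and padding below are made-up values, chosen only to show how edge blocks receive extra zero-padding while interior blocks read a full padding margin from the raster:

# Illustration only: print the per-block parameters for a hypothetical
# 10-line by 6-column raster, processed 4 lines at a time with a kernel
# that needs 2 lines / 2 columns of padding.
for bp in block_param_generator(lines_per_block=4,
                                data_shape=(10, 6),
                                pad_shape=(2, 2)):
    print(f'read_start_line={bp.read_start_line}, read_length={bp.read_length}, '
          f'write_start_line={bp.write_start_line}, block_length={bp.block_length}, '
          f'block_pad={bp.block_pad}')

Each yielded BlockParam describes a padded read window plus the zero-padding to apply at the raster edges, so that every filtered block writes back exactly block_length un-padded lines.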

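Similarly, a hedged usage sketch for filter_data (again, not part of the commit). It assumes the functions above are importable and that the ISCE3 build provides isce3.signal.convolve2D, which the committed code calls; the file names and the 9x9 boxcar kernel are placeholders. Note that kernel_rows is shaped as a column vector and kernel_cols as a row vector, matching how filter_data derives its padding from len(kernel_rows) and kernel_cols.shape[1]:

import numpy as np

# Placeholder separable kernels: a 9x9 boxcar split into a column vector
# (rows direction) and a row vector (columns direction).
kernel_rows = np.ones((9, 1), dtype=np.float64) / 9
kernel_cols = np.ones((1, 9), dtype=np.float64) / 9

# Placeholder paths: the input must be a GDAL-readable raster; the output
# GTiff is created by filter_data if it does not already exist.
filter_data('interferogram.tif', lines_per_block=1000,
            kernel_rows=kernel_rows, kernel_cols=kernel_cols,
            output_data='filtered_interferogram.tif')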