
Commit e97df21

Author: Luc Hermitte (committed)
Merge branch '168-invalid-parameter-used-in-eodag-search-request-for-polarisation-mode' into 'develop'

Resolve "Invalid parameter used in eodag search request for polarisation mode"
Closes #168
See merge request s1-tiling/s1tiling!138

2 parents 406b4ef + ced5c08

File tree: 12 files changed, +127 −216 lines

docs/FAQ.rst — 1 addition, 1 deletion

@@ -62,7 +62,7 @@ This Q/A is tracked in `S1Tiling issue #70
 Q: How can I overcome timeouts when searching for online products?
 ------------------------------------------------------------------

-Some data providers like PEPS may fail to obtain in time the list of products
+Some data providers like Geodes may fail to obtain in time the list of products
 matching our criteria.

 Since `EODAG <https://github.com/CS-SI/eodag>`_ v2.11.0, we can override the

docs/configuration.rst — 2 additions, 2 deletions

@@ -160,12 +160,12 @@ You can use this :download:`this template
 See :external+eodag:std:doc:`EODAG § on Configure EODAG
 <getting_started_guide/configure>`

-For instance, given a PEPS account, :file:`$HOME/.config/eodag/eodag.yml`
+For instance, given a geodes account, :file:`$HOME/.config/eodag/eodag.yml`
 could contain

 .. code-block:: yaml

-    peps:
+    geodes:
         auth:
             credentials:
                 username: THEUSERNAME
docs/intro.rst — 6 additions, 6 deletions

@@ -40,14 +40,14 @@ existing CNES open source project `Orfeo Tool Box <https://www.orfeo-toolbox.org

 The resulting images are gridded to Sentinel-2 MGRS geographic reference grid (`S2 tiling system - kml file <https://sentinel.esa.int/documents/247904/1955685/S2A_OPER_GIP_TILPAR_MPC__20151209T095117_V20150622T000000_21000101T000000_B00.kml>`_).
 Thanks to `EODAG <https://eodag.readthedocs.io/>`_, different Sentinel-1 data providers can be used
-like `PEPS <https://peps.cnes.fr/>`_ or `Copernicus Data Space <https://scihub.copernicus.eu>`_.
+like `Geodes <https://geodes-portal.cnes.fr/>`_ or `Copernicus Data Space <https://scihub.copernicus.eu>`_.
 It can be used on any type of platform, from a large computing cluster to a
 laptop (the fan will make some noise during processing). It is considerably
 faster than the ortho-rectification tool in ESA SNAP software with similar results and can be easily used in
 script form.

-S1Tiling is currently used for many applications, such deforestation detection
-in the Amazon, monitoring of rice crops in Southeast Asia or monitoring of
-water stocks in India. In addition, this software is accessible as an on-demand
-processing service on the French PEPS collaborative ground segment, in order to
-make it easier for users to use.
+S1Tiling is currently used as Sentinel-1 data pre-processing for many
+applications, such as deforestation detection in the Amazon, monitoring of rice
+crops in Southeast Asia or monitoring of water stocks in India. In addition,
+this software will be implemented in GEODES, the French portal for Earth
+Observation data, in order to provide SAR Analysis Ready Data to users.

docs/release_notes.rst — 3 additions, 0 deletions

@@ -105,6 +105,9 @@ v1.2.0 Improvements
   uppercase or lowercase. A new time stamp is also defined to hold the first
   time stamp among the ones from the input S1 images
   (`#188 <https://gitlab.orfeo-toolbox.org/s1-tiling/s1tiling/-/issues/188>`_).
+- Product downloading has been fixed to work with all data providers supported
+  by EODAG v3.9.0+
+  (`#168 <https://gitlab.orfeo-toolbox.org/s1-tiling/s1tiling/-/issues/168>`_).

 v1.2.0 Bugs fixed
 +++++++++++++++++

s1tiling/libs/S1FileManager.py — 2 additions, 2 deletions

@@ -499,7 +499,7 @@ def _search_products(  # pylint: disable=too-many-arguments, too-many-locals
     assert polarization in ['VV VH', 'VV', 'VH', 'HH HV', 'HH', 'HV']
     # In case only 'VV' or 'VH' is requested, we still need to
     # request 'VV VH' to the data provider through eodag.
-    dag_polarization_param = 'VV VH' if polarization in ['VV VH', 'VV', 'VH'] else 'HH HV'
+    dag_polarization_param = 'VV+VH' if polarization in ['VV VH', 'VV', 'VH'] else 'HH+HV'
     dag_orbit_dir_param = k_dir_assoc.get(orbit_direction or "", None)  # None => all ; <<or "">> used to silence mypy
     dag_orbit_list_param = relative_orbit_list[0] if len(relative_orbit_list) == 1 else None
     dag_platform_list_param = platform_list[0] if len(platform_list) == 1 else None

@@ -515,7 +515,7 @@ def _search_products(  # pylint: disable=too-many-arguments, too-many-locals
         start=first_date, end=last_date,
         box=extent,
         # If we have eodag v1.6+, we try to filter product during the search request
-        polarizationMode=dag_polarization_param,
+        polarizationChannels=dag_polarization_param,
         sensorMode="IW",
         orbitDirection=dag_orbit_dir_param,  # None => all
         relativeOrbitNumber=dag_orbit_list_param,  # List doesn't work. Single number yes!
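The core of the fix above is the value handed to EODAG: user-facing polarization options ('VV', 'VH', …) are widened to the dual-channel strings ('VV+VH' / 'HH+HV') that recent EODAG providers expect in `polarizationChannels`. A minimal re-statement of that mapping, with an illustrative function name that is not part of S1Tiling's API:

```python
def to_dag_polarization(polarization: str) -> str:
    """Map an S1Tiling polarization option to an EODAG polarizationChannels value."""
    assert polarization in ('VV VH', 'VV', 'VH', 'HH HV', 'HH', 'HV')
    # Requesting a single channel still needs to fetch the dual-polarization product.
    return 'VV+VH' if polarization in ('VV VH', 'VV', 'VH') else 'HH+HV'
```

For instance, `to_dag_polarization('VH')` yields `'VV+VH'`: the dual-pol product is downloaded even when only one channel will be processed.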

s1tiling/libs/configuration.py — 42 additions, 30 deletions

@@ -275,6 +275,14 @@ def getboolean(self, section: str, name: str, **kwargs) -> bool:
         """Helper function to report errors while extracting boolean configuration options"""
         return getboolean_opt(self.__config, self.config_file, section, name, **kwargs)

+    def getoptionalboolean(self, section: str, name: str, **kwargs) -> Optional[bool]:
+        """Helper function to report errors while extracting optional boolean configuration options"""
+        try:
+            return getboolean_opt(self.__config, self.config_file, section, name, **kwargs)
+        except Exception:  # pylint: disable=broad-except
+            # We cannot use "fallback=None" to handle ": None" w/ getboolean()
+            return None
+
     def get_items(self, section: str) -> Dict:
         """Helper function to return configuration items from a section"""
         res = {}

@@ -453,16 +461,13 @@ def __init_processing(self, accessor: _ConfigAccessor) -> None:
             accessor.throw(f"Unexpected value for Processing.cache_dem_by option: '{self.cache_dem_by}' is neither 'copy' nor 'symlink'")

         # - - - - - - - - - -[ Cut margins
-        try:
-            self.override_azimuth_cut_threshold_to: Optional[bool] = accessor.getboolean('Processing', 'override_azimuth_cut_threshold_to')
-        except Exception:  # pylint: disable=broad-except
-            # We cannot use "fallback=None" to handle ": None" w/ getboolean()
-            #: Internal to override analysing of top/bottom cutting: See :ref:`[Processing.override_azimuth_cut_threshold_to] <Processing.override_azimuth_cut_threshold_to>`
-            self.override_azimuth_cut_threshold_to = None
+        #: Internal to override analysing of top/bottom cutting: See :ref:`[Processing.override_azimuth_cut_threshold_to] <Processing.override_azimuth_cut_threshold_to>`
+        self.override_azimuth_cut_threshold_to: Optional[bool] = accessor.getoptionalboolean('Processing', 'override_azimuth_cut_threshold_to')

         # - - - - - - - - - -[ Calibration
         #: SAR Calibration applied: See :ref:`[Processing.calibration] <Processing.calibration>`
         self.calibration_type = accessor.get('Processing', 'calibration')
+
         #: Shall we remove thermal noise: :ref:`[Processing.remove_thermal_noise] <Processing.remove_thermal_noise>`
         self.removethermalnoise = accessor.getboolean('Processing', 'remove_thermal_noise')
         if self.removethermalnoise and otb_version() < '7.4.0':

@@ -477,27 +482,27 @@ def __init_processing(self, accessor: _ConfigAccessor) -> None:
(whitespace-only realignment of the following assignments; content unchanged)

         # - - - - - - - - - -[ Gamma area computation
         #: Resampling: See :ref:`[Processing.use_resampled_dem] <Processing.use_resampled_dem>`
         self.use_resampled_dem = accessor.getboolean('Processing', 'use_resampled_dem', fallback=True)
         no_use_resampled_dem = accessor.getboolean('Processing', 'no_use_resampled_dem', fallback=None)
         if no_use_resampled_dem is not None:
             accessor.throw("'no_use_resampled_dem' has be deprecated, please use the positive option: 'use_resampled_dem' instead")

         #: Resampling: See :ref:`[Processing.factor_x] <Processing.resample_dem_factor_x>`
         self.resample_dem_factor_x :float = accessor.getfloat('Processing', 'resample_dem_factor_x', fallback=2.0)
         #: Resampling: See :ref:`[Processing.factor_y] <Processing.resample_dem_factor_y>`
         self.resample_dem_factor_y :float = accessor.getfloat('Processing', 'resample_dem_factor_y', fallback=2.0)

         #: Gamma area: See :ref:`[Processing.distribute_area] <Processing.distribute_area>`
         self.distribute_area :bool = accessor.getboolean('Processing', 'distribute_area', fallback=False)
         #: Gamma area: See :ref:`[Processing.inner_margin_ratio] <Processing.inner_margin_ratio>`
         self.inner_margin_ratio :float = accessor.getfloat('Processing', 'inner_margin_ratio', fallback=0.01)
         #: Gamma area: See :ref:`[Processing.outer_margin_ratio] <Processing.outer_margin_ratio>`
         self.outer_margin_ratio :float = accessor.getfloat('Processing', 'outer_margin_ratio', fallback=0.04)

         #: Gamma area to gamma naught rtc: See :ref:`[Processing.min_gamma_area] <Processing.min_gamma_area>`
         self.min_gamma_area :float = accessor.getfloat('Processing', 'min_gamma_area', fallback=1.0)
         #: Gamma area to gamma naught rtc: See :ref:`[Processing.calibration_factor] <Processing.calibration_factor>`
         self.calibration_factor :float = accessor.getfloat('Processing', 'calibration_factor', fallback=1.0)

         # - - - - - - - - - -[ Orthorectification
         #: Pixel size (in meters) of the output images: :ref:`[Processing.output_spatial_resolution] <Processing.output_spatial_resolution>`

@@ -514,21 +519,9 @@ def __init_processing(self, accessor: _ConfigAccessor) -> None:
         if not os.path.isfile(self.output_grid):
             accessor.throw(f"output_grid={self.output_grid} is not a valid path")

-        # IF tiles_list_in_file is set, use the option, and throw if there is an error
-        # ELSE: if unset, then use "tiles" option
-        tiles_file = accessor.get('Processing', 'tiles_list_in_file', fallback=None)
-        if tiles_file:
-            try:
-                with open(tiles_file, 'r', encoding='utf-8') as tiles_file_handle:
-                    tile_list = tiles_file_handle.readlines()
-                    self.tile_list: List[str] = [s.rstrip() for s in tile_list]
-                    logging.info("The following tiles will be processed: %s", self.tile_list)
-            except Exception as e:  # pylint: disable=broad-exception-caught
-                accessor.throw(f"Cannot read tile list file {tiles_file!r}", e)
-        else:
-            tiles = accessor.get('Processing', 'tiles')
-            #: List of S2 tiles to process: See :ref:`[Processing.tiles] <Processing.tiles>`
-            self.tile_list = _split_option(tiles)
+        #: List of S2 tiles to process: See :ref:`[Processing.tiles] <Processing.tiles>`
+        self.tile_list: List[str] = self.__extract_tile_list_option(accessor)
+        # logging.info("The following tiles will be processed: %s", self.tile_list)

         # - - - - - - - - - -[ Parallelization & RAM
         #: Number of tasks executed in parallel: See :ref:`[Processing.nb_parallel_processes] <Processing.nb_parallel_processes>`

@@ -586,6 +579,25 @@ def __init_filtering(self, accessor: _ConfigAccessor) -> None:
         else:
             accessor.throw(f"Invalid despeckling filter value '{self.filter}'. Select one among none/lee/frost/gammamap/kuan")

+    def __extract_tile_list_option(self, accessor: _ConfigAccessor) -> List[str]:  # pylint: disable=inconsistent-return-statements
+        # NB: pylint is unable to see accessor.throw() is NoReturn, hence the disable=inconsistent-return-statements
+        # It seems related to https://github.com/pylint-dev/pylint/issues/9692
+
+        # IF tiles_list_in_file is set, use the option, and throw if there is an error
+        # ELSE: if unset, then use "tiles" option
+        tiles_file = accessor.get('Processing', 'tiles_list_in_file', fallback=None)
+        if tiles_file:
+            try:
+                with open(tiles_file, 'r', encoding='utf-8') as tiles_file_handle:
+                    tile_list = tiles_file_handle.readlines()
+                    return [s.rstrip() for s in tile_list]
+            except Exception as e:  # pylint: disable=broad-exception-caught
+                accessor.throw(f"Cannot read tile list file {tiles_file!r}", e)
+        else:
+            tiles = accessor.get('Processing', 'tiles')
+            return _split_option(tiles)
+

     # ----------------------------------------------------------------------
     def __init_fname_fmt(self, accessor: _ConfigAccessor) -> None:
         # Permit to override default file name formats
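The `getoptionalboolean()` helper introduced above works around a `configparser` limitation: `getboolean()` cannot express a tri-state via `fallback=` when the key is present but set to the string "None", so the `None` case is recovered by catching the conversion error. A self-contained sketch of the same pattern using the standard library directly (names are illustrative, not S1Tiling's):

```python
import configparser
from typing import Optional

def get_optional_boolean(cfg: configparser.ConfigParser,
                         section: str, name: str) -> Optional[bool]:
    """Return True/False when the option parses as a boolean, None otherwise."""
    try:
        return cfg.getboolean(section, name)
    except (ValueError, configparser.NoOptionError, configparser.NoSectionError):
        # Covers ": None" values as well as absent options or sections.
        return None

cfg = configparser.ConfigParser()
cfg.read_string(
    "[Processing]\n"
    "override_azimuth_cut_threshold_to = None\n"
    "remove_thermal_noise = yes\n"
)
```

Here `get_optional_boolean(cfg, 'Processing', 'override_azimuth_cut_threshold_to')` returns `None` (the string "None" is not a valid boolean), while the `remove_thermal_noise` option returns `True`.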

s1tiling/libs/orbit/_manager.py — 1 addition, 0 deletions

@@ -302,6 +302,7 @@ def _fetch_eof_files(  # pylint: disable=too-many-arguments
     def search_for(  # pylint: disable=too-many-arguments
             self,
             relative_orbits: List[int],
+            *,
             missions   : Iterable[str] = (),
             first_date : Optional[datetime] = None,
             last_date  : Optional[datetime] = None,
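The bare `*` added to `search_for()` above makes every following parameter keyword-only, so call sites cannot silently pass `missions` or the dates positionally. A hedged sketch of the effect (the body and return value are illustrative stand-ins, not the real EOF search):

```python
from datetime import datetime
from typing import Iterable, List, Optional

def search_for(relative_orbits: List[int], *,
               missions: Iterable[str] = (),
               first_date: Optional[datetime] = None,
               last_date: Optional[datetime] = None) -> str:
    """Stand-in for the real search: merely summarises its arguments."""
    return f"orbits={relative_orbits} missions={list(missions)}"
```

With this signature, `search_for([30], missions=['S1A'])` is accepted, whereas `search_for([30], ['S1A'])` raises `TypeError` at the call site instead of misassigning the argument.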

s1tiling/libs/otbpipeline.py — 0 additions, 110 deletions

@@ -43,7 +43,6 @@
 from itertools import filterfalse
 import logging
 import logging.handlers
-import multiprocessing
 from typing import Dict, Generic, List, Optional, Protocol, Set, Tuple, Type, TypeVar, Union, runtime_checkable

 from distributed import get_worker

@@ -1061,8 +1060,6 @@ def generate_tasks(self, do_watch_ram=False) -> Tuple[TaskNodeDict, List[str], L
         return tasks, final_products, []


-# ======================================================================
-# Multi processing related (old) code
 def mp_worker_config(queue):
     """
     Worker configuration function called by Pool().

@@ -1077,110 +1074,3 @@ def mp_worker_config(queue):
     global logger
     logger = logging.getLogger()
     logger.addHandler(qh)
-
-
-# TODO: try to make it static...
-def execute4mp(pipeline):
-    """
-    Internal worker function used by multiprocess to execute a pipeline.
-    """
-    return pipeline.do_execute()
-
-
-class PoolOfOTBExecutions:
-    """
-    Internal multiprocess Pool of OTB pipelines.
-    """
-    def __init__(self,
-                 title,
-                 do_measure,
-                 nb_procs, nb_threads,
-                 log_queue, log_queue_listener,
-                 debug_otb) -> None:
-        """
-        constructor
-        """
-        self.__pool = []
-        self.__title = title
-        self.__do_measure = do_measure
-        self.__nb_procs = nb_procs
-        self.__nb_threads = nb_threads
-        self.__log_queue = log_queue
-        self.__log_queue_listener = log_queue_listener
-        self.__debug_otb = debug_otb
-
-    def new_pipeline(self, **kwargs):
-        """
-        Register a new pipeline.
-        """
-        in_memory = kwargs.get('in_memory', True)
-        do_watch_ram = kwargs.get('do_watch_ram', False)
-        pipeline = Pipeline(self.__do_measure, in_memory, do_watch_ram)
-        self.__pool.append(pipeline)
-        return pipeline
-
-    def process(self):
-        """
-        Executes all the pipelines in parallel.
-        """
-        nb_cmd = len(self.__pool)
-
-        os.environ["ITK_GLOBAL_DEFAULT_NUMBER_OF_THREADS"] = str(self.__nb_threads)
-        os.environ["GDAL_NUM_THREADS"] = str(self.__nb_threads)
-        os.environ['OTB_LOGGER_LEVEL'] = 'DEBUG'
-        if self.__debug_otb:  # debug OTB applications with gdb => do not spawn process!
-            execute4mp(self.__pool[0])
-        else:
-            with multiprocessing.Pool(self.__nb_procs, mp_worker_config, [self.__log_queue]) as pool:
-                self.__log_queue_listener.start()
-                for count, result in enumerate(pool.imap_unordered(execute4mp, self.__pool), 1):
-                    logger.info("%s correctly finished", result)
-                    logger.info(' --> %s... %s%%', self.__title, count * 100. / nb_cmd)
-
-                pool.close()
-                pool.join()
-            self.__log_queue_listener.stop()
-
-
-class Processing:
-    """
-    Entry point for executing multiple instance of the same pipeline of
-    different inputs.
-
-    1. The object is initialized with a log queue and its listener
-    2. The pipeline is registered with a list of :class:`StepFactory` s
-    3. The processing is done on a list of :class:`FirstStep` s
-    """
-    def __init__(self, cfg, debug_otb) -> None:
-        self.__log_queue = cfg.log_queue
-        self.__log_queue_listener = cfg.log_queue_listener
-        self.__cfg = cfg
-        self.__factory_steps = []
-        self.__debug_otb = debug_otb
-
-    def register_pipeline(self, factory_steps):
-        """
-        Register a list of :class:`StepFactory` s that describes a pipeline.
-        """
-        # Automatically append the final storing step
-        self.__factory_steps = factory_steps + [Store]
-
-    def process(self, startpoints):
-        """
-        Defines pipelines from the registered steps. Each pipeline is instanciated with a
-        startpoint. Then they are registered into the PoolOfOTBExecutions.
-        The pool is finally executed.
-        """
-        assert self.__factory_steps
-        pool = PoolOfOTBExecutions("testpool", True,
-                                   self.__cfg.nb_procs, self.__cfg.OTBThreads,
-                                   self.__log_queue, self.__log_queue_listener, debug_otb=self.__debug_otb)
-        for startpoint in startpoints:
-            logger.info("register processing of %s", startpoint.basename)
-            pipeline = pool.new_pipeline(in_memory=True)
-            pipeline.set_inputs(startpoint)
-            for factory in self.__factory_steps:
-                pipeline.push(factory(self.__cfg))
-
-        logger.debug('Launch pipelines')
-        pool.process()
