diff --git a/README.md b/README.md
index 7a7eb47..6461a35 100644
--- a/README.md
+++ b/README.md
@@ -12,7 +12,7 @@ can be used through command-line scripts.
 
 [![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.4281480.svg)](https://doi.org/10.5281/zenodo.4281480)
 [![build](https://github.com/nfsi-canada/OBStools/workflows/Build/badge.svg)](https://github.com/nfsi-canada/OBStools/actions)
-[![codecov](https://codecov.io/gh/nfsi-canada/OBStools/branch/master/graph/badge.svg)](https://codecov.io/gh/nfsi-canada/OBStools)
+
 
 Installation, API documentation, scripts and tutorials are described at
 https://nfsi-canada.github.io/OBStools/
diff --git a/docs/atacr.rst b/docs/atacr.rst
index e10797b..a66d3ed 100644
--- a/docs/atacr.rst
+++ b/docs/atacr.rst
@@ -38,7 +38,7 @@ Compliance is defined as the spectral ratio between pressure and vertical
 displacement data. Compliance noise arises from seafloor deformation due to
 seafloor and water wave effects (including infragravity waves).
 This is likely the main source of noise in vertical component OBS data.
-This analysis therefore requires both vertical (``?HZ``) and pressure (``?XH``) data.
+This analysis therefore requires both vertical (``?HZ``) and pressure (``?DH``) data.
 
 **Tilt**
 
@@ -203,7 +203,7 @@ program values or by refining parameters. All of them use a station database pro
 Description
 -----------
 
-Downloads up to four-component (H1, H2, Z and P), day-long seismograms
+Downloads up to four-component (1, 2, Z and P), day-long seismograms
 to use in noise corrections of vertical component data. Station selection is
 specified by a network and station code. The database is provided as a
 ``StDb`` dictionary.
@@ -214,89 +214,94 @@ Usage
 
 .. 
code-block:: $ atacr_download_data -h + + ########################################################################### + # _ _ _ _ _ # + # __| | _____ ___ __ | | ___ __ _ __| | __| | __ _| |_ __ _ # + # / _` |/ _ \ \ /\ / / '_ \| |/ _ \ / _` |/ _` | / _` |/ _` | __/ _` | # + # | (_| | (_) \ V V /| | | | | (_) | (_| | (_| | | (_| | (_| | || (_| | # + # \__,_|\___/ \_/\_/ |_| |_|_|\___/ \__,_|\__,_|___\__,_|\__,_|\__\__,_| # + # |_____| # + # # + ########################################################################### + usage: atacr_download_data [options] - Script used to download and pre-process up to four-component (H1, H2, Z and - P), day-long seismograms to use in noise corrections of vertical component of - OBS data. Data are requested from the internet using the client services - framework for a given date range. The stations are processed one by one and - the data are stored to disk. + Script used to download and pre-process up to four-component (H1, H2, Z and P), day-long + seismograms to use in noise corrections of vertical component of OBS data. Data are requested + from the internet using the client services framework for a given date range. The stations + are processed one by one and the data are stored to disk. positional arguments: indb Station Database to process from. - optional arguments: + options: -h, --help show this help message and exit - --keys STKEYS Specify a comma-separated list of station keys for - which to perform the analysis. These must be contained - within the station database. Partial keys will be used - to match against those in the dictionary. For - instance, providing IU will match with all stations in - the IU network. [Default processes all stations in the - database] - -C CHANNELS, --channels CHANNELS - Specify a comma-separated list of channels for which - to perform the transfer function analysis. Possible - options are '12' (for horizontal channels) or 'P' (for - pressure channel). 
Specifying '12' allows for tilt - correction. Specifying 'P' allows for compliance - correction. [Default looks for both horizontal and - pressure and allows for both tilt AND compliance - corrections] - --zcomp ZCOMP Specify the Vertical Component Channel Identifier. - [Default Z]. - -O, --overwrite Force the overwriting of pre-existing data. [Default - False] + --keys STKEYS Specify a comma-separated list of station keys for which to perform + the analysis. These must be contained within the station database. + Partial keys will be used to match against those in the dictionary. + For instance, providing IU will match with all stations in the IU + network. [Default processes all stations in the database] + -O, --overwrite Force the overwriting of pre-existing data. [Default False] + -Z ZCOMP, --zcomp ZCOMP + Specify the Vertical Component Channel Identifier. [Default Z]. Server Settings: Settings associated with which datacenter to log into. --server SERVER Base URL of FDSN web service compatible server (e.g. - “http://service.iris.edu”) or key string for - recognized server (one of 'AUSPASS', 'BGR', - 'EARTHSCOPE', 'EIDA', 'EMSC', 'ETH', 'GEOFON', - 'GEONET', 'GFZ', 'ICGC', 'IESDMC', 'INGV', 'IPGP', - 'IRIS', 'IRISPH5', 'ISC', 'KNMI', 'KOERI', 'LMU', - 'NCEDC', 'NIEP', 'NOA', 'NRCAN', 'ODC', 'ORFEUS', - 'RASPISHAKE', 'RESIF', 'RESIFPH5', 'SCEDC', 'TEXNET', - 'UIB-NORSAR', 'USGS', 'USP'). [Default 'IRIS'] - --user-auth USERAUTH Authentification Username and Password for the - waveform server (--user-auth='username:authpassword') - to access and download restricted data. 
[Default no - user and password] + “http://service.iris.edu”) or key string for recognized server (one + of 'AUSPASS', 'BGR', 'EARTHSCOPE', 'EIDA', 'EMSC', 'ETH', 'GEOFON', + 'GEONET', 'GFZ', 'ICGC', 'IESDMC', 'INGV', 'IPGP', 'IRIS', 'IRISPH5', + 'ISC', 'KNMI', 'KOERI', 'LMU', 'NCEDC', 'NIEP', 'NOA', 'NRCAN', + 'ODC', 'ORFEUS', 'RASPISHAKE', 'RESIF', 'RESIFPH5', 'SCEDC', + 'TEXNET', 'UIB-NORSAR', 'USGS', 'USP'). [Default 'IRIS'] + --user-auth USERAUTH Authentification Username and Password for the waveform server + (--user-auth='username:authpassword') to access and download + restricted data. [Default no user and password] --eida-token TOKENFILE - Token for EIDA authentication mechanism, see - http://geofon.gfz- - potsdam.de/waveform/archive/auth/index.php. If a token - is provided, argument --user-auth will be ignored. - This mechanism is only available on select EIDA nodes. - The token can be provided in form of the PGP message - as a string, or the filename of a local file with the + Token for EIDA authentication mechanism, see http://geofon.gfz- + potsdam.de/waveform/archive/auth/index.php. If a token is provided, + argument --user-auth will be ignored. This mechanism is only + available on select EIDA nodes. The token can be provided in form of + the PGP message as a string, or the filename of a local file with the PGP message in it. [Default None] + Local Data Settings: + Settings associated with defining and using a local data base of pre-downloaded day-long + SAC or MSEED files. + + --local-data LOCALDATA + Specify absolute path to a SeisComP Data Structure (SDS) archive + containing day-long SAC or MSEED files(e.g., --local- + data=/Home/username/Data/SDS). See + https://www.seiscomp.de/seiscomp3/doc/applications/slarchive/SDS.html + for details on the SDS format. If this option is used, it takes + precedence over the --server-wf settings. + --dtype DTYPE Specify the data archive file type, either SAC or MSEED. 
Note the + default behaviour is to search for SAC files. Local archive files + must have extensions of '.SAC' or '.MSEED'. These are case dependent, + so specify the correct case here. + Frequency Settings: Miscellaneous frequency settings --sampling-rate NEW_SAMPLING_RATE Specify new sampling rate (float, in Hz). [Default 5.] - --units UNITS Choose the output seismogram units. Options are: - 'DISP', 'VEL', 'ACC'. [Default 'DISP'] - --pre-filt PRE_FILT Specify four comma-separated corner frequencies - (float, in Hz) for deconvolution pre-filter. [Default - 0.001,0.005,45.,50.] + --units UNITS Choose the output seismogram units. Options are: 'DISP', 'VEL', + 'ACC'. [Default 'DISP'] + --pre-filt PRE_FILT Specify four comma-separated corner frequencies (float, in Hz) for + deconvolution pre-filter. [Default 0.001,0.005,45.,50.] Time Search Settings: Time settings associated with searching for day-long seismograms - --start STARTT Specify a UTCDateTime compatible string representing - the start day for the data search. This will override - any station start times. [Default start date for each - station in database] - --end ENDT Specify a UTCDateTime compatible string representing - the start time for the event search. This will - override any station end times [Default end date for - each station in database] - + --start STARTT Specify a UTCDateTime compatible string representing the start day + for the data search. This will override any station start times. + [Default start date for each station in database] + --end ENDT Specify a UTCDateTime compatible string representing the start time + for the event search. This will override any station end times + [Default end date for each station in database] ``atacr_daily_spectra`` +++++++++++++++++++++++ @@ -315,6 +320,17 @@ Usage .. 
code-block:: $ atacr_daily_spectra -h + + ################################################################# + # _ _ _ _ # + # __| | __ _(_) |_ _ ___ _ __ ___ ___| |_ _ __ __ _ # + # / _` |/ _` | | | | | | / __| '_ \ / _ \/ __| __| '__/ _` | # + # | (_| | (_| | | | |_| | \__ \ |_) | __/ (__| |_| | | (_| | # + # \__,_|\__,_|_|_|\__, |___|___/ .__/ \___|\___|\__|_| \__,_| # + # |___/_____| |_| # + # # + ################################################################# + usage: atacr_daily_spectra [options] Script used to extract shorter windows from the day-long seismograms, @@ -360,9 +376,11 @@ Usage (or 30 percent)] --minwin MINWIN Specify minimum number of 'good' windows in any given day to continue with analysis. [Default 10] - --freq-band PD Specify comma-separated frequency limits (float, in Hz) - over which to calculate spectral features used in - flagging the bad windows. [Default 0.004,2.0] + --flag-freqs PD Specify comma-separated frequency limits (float, in Hz) over which to + calculate spectral features used in flagging the bad windows. [Default + 0.004,2.0] + --tilt-freqs TF Specify comma-separated frequency limits (float, in Hz) over which to + calculate tilt. [Default 0.005,0.035] --tolerance TOL Specify parameter for tolerance threshold. If spectrum > std*tol, window is flagged as bad. [Default 2.0] --alpha ALPHA Specify confidence level for f-test, for iterative @@ -408,6 +426,17 @@ Usage .. 
code-block:: $ atacr_clean_spectra -h + + ################################################################### + # _ _ # + # ___| | ___ __ _ _ __ ___ _ __ ___ ___| |_ _ __ __ _ # + # / __| |/ _ \/ _` | '_ \ / __| '_ \ / _ \/ __| __| '__/ _` | # + # | (__| | __/ (_| | | | | \__ \ |_) | __/ (__| |_| | | (_| | # + # \___|_|\___|\__,_|_| |_|___|___/ .__/ \___|\___|\__|_| \__,_| # + # |_____| |_| # + # # + ################################################################### + usage: atacr_clean_spectra [options] Script used to extract daily spectra calculated from `obs_daily_spectra.py` @@ -443,9 +472,9 @@ Usage Parameter Settings: Miscellaneous default values and settings - --freq-band PD Specify comma-separated frequency limits (float, in Hz) - over which to calculate spectral features used in flagging - the days/windows. [Default 0.004,2.0] + --flag-freqs PD Specify comma-separated frequency limits (float, in Hz) over which to + calculate spectral features used in flagging the days/windows. [Default + 0.004,2.0] --tolerance TOL Specify parameter for tolerance threshold. If spectrum > std*tol, window is flagged as bad. [Default 1.5] --alpha ALPHA Confidence level for f-test, for iterative flagging of @@ -485,12 +514,23 @@ Usage .. 
code-block:: $ atacr_transfer_functions -h + + ####################################################################################### + # _ __ __ _ _ # + # | |_ _ __ __ _ _ __ ___ / _| ___ _ __ / _|_ _ _ __ ___| |_(_) ___ _ __ ___ # + # | __| '__/ _` | '_ \/ __| |_ / _ \ '__|| |_| | | | '_ \ / __| __| |/ _ \| '_ \/ __| # + # | |_| | | (_| | | | \__ \ _| __/ | | _| |_| | | | | (__| |_| | (_) | | | \__ \ # + # \__|_| \__,_|_| |_|___/_| \___|_|___|_| \__,_|_| |_|\___|\__|_|\___/|_| |_|___/ # + # |_____| # + # # + ####################################################################################### + usage: atacr_transfer_functions [options] Script used to calculate transfer functions between various components, to be used in cleaning vertical component of OBS data. The noise data can be those - obtained from the daily spectra (i.e., from `obs_daily_spectra.py`) or those - obtained from the averaged noise spectra (i.e., from `obs_clean_spectra.py`). + obtained from the daily spectra (i.e., from `atacr_daily_spectra`) or those + obtained from the averaged noise spectra (i.e., from `atacr_clean_spectra`). Flags are available to specify the source of data to use as well as the time range over which to calculate the transfer functions. The stations are processed one by one and the data are stored to disk. @@ -546,7 +586,7 @@ Usage Description ----------- -Downloads up to four-component (H1, H2, Z and P), two-hour-long seismograms +Downloads up to four-component (1,2,Z,H), two-hour-long seismograms for individual seismic events to use in noise corrections of vertical component data. Station selection is specified by a network and station code. The database is provided as a ``StDb`` dictionary. @@ -557,109 +597,111 @@ Usage .. 
code-block:: $ atacr_download_event -h + + ############################################################################### + # _ _ _ _ # + # __| | _____ ___ __ | | ___ __ _ __| | _____ _____ _ __ | |_ # + # / _` |/ _ \ \ /\ / / '_ \| |/ _ \ / _` |/ _` | / _ \ \ / / _ \ '_ \| __| # + # | (_| | (_) \ V V /| | | | | (_) | (_| | (_| | | __/\ V / __/ | | | |_ # + # \__,_|\___/ \_/\_/ |_| |_|_|\___/ \__,_|\__,_|___\___| \_/ \___|_| |_|\__| # + # |_____| # + # # + ############################################################################### + usage: atacr_download_event [options] - Script used to download and pre-process four-component (H1, H2, Z and P), two- - hour-long seismograms for individual events on which to apply the de-noising - algorithms. Data are requested from the internet using the client services - framework for a given date range. The stations are processed one by one and - the data are stored to disk. + Script used to download and pre-process four-component (H1, H2, Z and P), two-hour-long + seismograms for individual events on which to apply the de-noising algorithms. Data are + requested from the internet using the client services framework for a given date range. The + stations are processed one by one and the data are stored to disk. positional arguments: indb Station Database to process from. - optional arguments: + options: -h, --help show this help message and exit - --keys STKEYS Specify a comma separated list of station keys for - which to perform the analysis. These must be contained - within the station database. Partial keys will be used - to match against those in the dictionary. For - instance, providing IU will match with all stations in - the IU network [Default processes all stations in the - database] - -C CHANNELS, --channels CHANNELS - Specify a comma-separated list of channels for which - to perform the transfer function analysis. Possible - options are '12' (for horizontal channels) or 'P' (for - pressure channel). 
Specifying '12' allows for tilt - correction. Specifying 'P' allows for compliance - correction. [Default looks for both horizontal and - pressure and allows for both tilt AND compliance - corrections] - --zcomp ZCOMP Specify the Vertical Component Channel Identifier. - [Default Z]. - -O, --overwrite Force the overwriting of pre-existing data. [Default - False] + --keys STKEYS Specify a comma separated list of station keys for which to perform + the analysis. These must be contained within the station database. + Partial keys will be used to match against those in the dictionary. + For instance, providing IU will match with all stations in the IU + network [Default processes all stations in the database] + -O, --overwrite Force the overwriting of pre-existing data. [Default False] + --zcomp ZCOMP Specify the Vertical Component Channel Identifier. [Default Z]. Server Settings: Settings associated with which datacenter to log into. --server SERVER Base URL of FDSN web service compatible server (e.g. - “http://service.iris.edu”) or key string for - recognized server (one of 'AUSPASS', 'BGR', - 'EARTHSCOPE', 'EIDA', 'EMSC', 'ETH', 'GEOFON', - 'GEONET', 'GFZ', 'ICGC', 'IESDMC', 'INGV', 'IPGP', - 'IRIS', 'IRISPH5', 'ISC', 'KNMI', 'KOERI', 'LMU', - 'NCEDC', 'NIEP', 'NOA', 'NRCAN', 'ODC', 'ORFEUS', - 'RASPISHAKE', 'RESIF', 'RESIFPH5', 'SCEDC', 'TEXNET', - 'UIB-NORSAR', 'USGS', 'USP'). [Default 'IRIS'] - --user-auth USERAUTH Authentification Username and Password for the - waveform server (--user-auth='username:authpassword') - to access and download restricted data. 
[Default no - user and password] + “http://service.iris.edu”) or key string for recognized server (one + of 'AUSPASS', 'BGR', 'EARTHSCOPE', 'EIDA', 'EMSC', 'ETH', 'GEOFON', + 'GEONET', 'GFZ', 'ICGC', 'IESDMC', 'INGV', 'IPGP', 'IRIS', 'IRISPH5', + 'ISC', 'KNMI', 'KOERI', 'LMU', 'NCEDC', 'NIEP', 'NOA', 'NRCAN', + 'ODC', 'ORFEUS', 'RASPISHAKE', 'RESIF', 'RESIFPH5', 'SCEDC', + 'TEXNET', 'UIB-NORSAR', 'USGS', 'USP'). [Default 'IRIS'] + --user-auth USERAUTH Authentification Username and Password for the waveform server + (--user-auth='username:authpassword') to access and download + restricted data. [Default no user and password] --eida-token TOKENFILE - Token for EIDA authentication mechanism, see - http://geofon.gfz- - potsdam.de/waveform/archive/auth/index.php. If a token - is provided, argument --user-auth will be ignored. - This mechanism is only available on select EIDA nodes. - The token can be provided in form of the PGP message - as a string, or the filename of a local file with the + Token for EIDA authentication mechanism, see http://geofon.gfz- + potsdam.de/waveform/archive/auth/index.php. If a token is provided, + argument --user-auth will be ignored. This mechanism is only + available on select EIDA nodes. The token can be provided in form of + the PGP message as a string, or the filename of a local file with the PGP message in it. [Default None] - + + Local Data Settings: + Settings associated with defining and using a local data base of pre-downloaded day-long + SAC or MSEED files. + + --local-data LOCALDATA + Specify absolute path to a SeisComP Data Structure (SDS) archive + containing day-long SAC or MSEED files(e.g., --local- + data=/Home/username/Data/SDS). See + https://www.seiscomp.de/seiscomp3/doc/applications/slarchive/SDS.html + for details on the SDS format. If this option is used, it takes + precedence over the --server-wf settings. + --dtype DTYPE Specify the data archive file type, either SAC or MSEED. 
Note the + default behaviour is to search for SAC files. Local archive files + must have extensions of '.SAC' or '.MSEED'. These are case dependent, + so specify the correct case here. + Frequency Settings: Miscellaneous frequency settings --sampling-rate NEW_SAMPLING_RATE Specify new sampling rate (float, in Hz). [Default 5.] - --units UNITS Choose the output seismogram units. Options are: - 'DISP', 'VEL', 'ACC'. [Default 'DISP'] - --pre-filt PRE_FILT Specify four comma-separated corner frequencies - (float, in Hz) for deconvolution pre-filter. [Default - 0.001,0.005,45.,50.] - --window WINDOW Specify window length in seconds. Default value is - highly recommended. Program may not be stable for - large deviations from default value. [Default 7200. - (or 2 hours)] + --units UNITS Choose the output seismogram units. Options are: 'DISP', 'VEL', + 'ACC'. [Default 'DISP'] + --pre-filt PRE_FILT Specify four comma-separated corner frequencies (float, in Hz) for + deconvolution pre-filter. [Default 0.001,0.005,45.,50.] + --window WINDOW Specify window length in seconds. Default value is highly + recommended. Program may not be stable for large deviations from + default value. [Default 7200. (or 2 hours)] Event Settings: - Settings associated with refining the events to include in matching - station pairs - - --start STARTT Specify a UTCDateTime compatible string representing - the start time for the event search. This will - override any station start times. [Default start date - of each station in database] - --end ENDT Specify a UTCDateTime compatible string representing - the start time for the event search. This will - override any station end times [Default end date of - each station in database] - --reverse-order, -R Reverse order of events. Default behaviour starts at - oldest event and works towards most recent. 
Specify - reverse order and instead the program will start with - the most recent events and work towards older - --min-mag MINMAG Specify the minimum magnitude of event for which to - search. [Default 5.5] - --max-mag MAXMAG Specify the maximum magnitude of event for which to - search. [Default None, i.e. no limit] + Settings associated with refining the events to include in matching station pairs + + --start STARTT Specify a UTCDateTime compatible string representing the start time + for the event search. This will override any station start times. + [Default start date of each station in database] + --end ENDT Specify a UTCDateTime compatible string representing the start time + for the event search. This will override any station end times + [Default end date of each station in database] + --reverse-order, -R Reverse order of events. Default behaviour starts at oldest event and + works towards most recent. Specify reverse order and instead the + program will start with the most recent events and work towards older + --min-mag MINMAG Specify the minimum magnitude of event for which to search. [Default + 5.5] + --max-mag MAXMAG Specify the maximum magnitude of event for which to search. [Default + None, i.e. no limit] Geometry Settings: Settings associatd with the event-station geometries - --min-dist MINDIST Specify the minimum great circle distance (degrees) - between the station and event. [Default 30] - --max-dist MAXDIST Specify the maximum great circle distance (degrees) - between the station and event. [Default 120] + --min-dist MINDIST Specify the minimum great circle distance (degrees) between the + station and event. [Default 30] + --max-dist MAXDIST Specify the maximum great circle distance (degrees) between the + station and event. [Default 120] ``atacr_correct_event`` +++++++++++++++++++++++ @@ -678,13 +720,24 @@ Usage .. 
code-block:: $ atacr_correct_event -h + + ################################################################## + # _ _ # + # ___ ___ _ __ _ __ ___ ___| |_ _____ _____ _ __ | |_ # + # / __/ _ \| '__| '__/ _ \/ __| __| / _ \ \ / / _ \ '_ \| __| # + # | (_| (_) | | | | | __/ (__| |_ | __/\ V / __/ | | | |_ # + # \___\___/|_| |_| \___|\___|\__|___\___| \_/ \___|_| |_|\__| # + # |_____| # + # # + ################################################################## + usage: atacr_correct_event [options] Script used to extract transfer functions between various components, and use them to clean vertical component of OBS data for selected events. The noise data can be those obtained from the daily spectra (i.e., from - `obs_daily_spectra.py`) or those obtained from the averaged noise spectra - (i.e., from `obs_clean_spectra.py`). Flags are available to specify the source + `atacr_daily_spectra`) or those obtained from the averaged noise spectra + (i.e., from `atacr_clean_spectra`). Flags are available to specify the source of data to use as well as the time range for given events. The stations are processed one by one and the data are stored to disk in a new 'CORRECTED' folder. @@ -763,17 +816,7 @@ M08A and send the prompt to a logfile .. code-block:: - $ query_fdsn_stdb -N 7D -C ?H? -S M08A M08A > logfile - -.. note:: - - If you are using a *Z shell* (as opposed to a *Bourne Shell* or - other types of shells) on the terminal, - this command above will fail due to the presence of the question marks for - pattern matching. To find out, just type `ps -p $$`, which will return - `zsh` under the `CMD` field if you are using a Z shell. In this case, just - enclose the `?H?` in single or double quotes (e.g., - `query_fdsn_stdb.py -N 7D -C "?H?" 
-S M08A M08A > logfile`)
+   $ query_fdsn_stdb -N 7D -S M08A M08A > logfile
 
 To check the station info for M08A, use the program ``ls_stdb``:
 
@@ -798,10 +841,9 @@ To check the station info for M08A, use the program ``ls_stdb``:
 +++++++++++++++++++++++
 
 We wish to download one month of data for the station M08A for March 2012.
-The various options above allow us to select the additional channels to use
-(e.g., ``-C 12,P`` for both horizontal and pressure data - which is the default
-setting). Default frequency settings for data pre-processing match those of
-the Matlab ``ATaCR`` software and can therefore be ignored when calling the program.
+The program will search for and download all available (1, 2, Z and P) data.
+Default frequency settings for data pre-processing match those of
+the Matlab ``ATaCR`` software.
 Since the file ``M08A.pkl`` contains only one station, it is not necessary to
 specify a key. This option would be useful if the database contained several
 stations and we were only interested in downloading data for M08A. In this case, we would
@@ -812,7 +854,7 @@ to specify the dates for which data will be downloaded. If you change your mind
 about the pre-processing options, you can always re-run the following line
 with the option ``-O``, which will overwrite the data saved to disk.
 
-To download all broadband seismic and pressure data, simply type in a terminal:
+To download all available data, simply type in a terminal:
 
 .. code-block::
 
@@ -822,7 +864,18 @@ An example log printed on the terminal will look like:
 
 .. 
code-block:: - Path to DATA/7D.M08A/ doesn`t exist - creating it + ########################################################################### + # _ _ _ _ _ # + # __| | _____ ___ __ | | ___ __ _ __| | __| | __ _| |_ __ _ # + # / _` |/ _ \ \ /\ / / '_ \| |/ _ \ / _` |/ _` | / _` |/ _` | __/ _` | # + # | (_| | (_) \ V V /| | | | | (_) | (_| | (_| | | (_| | (_| | || (_| | # + # \__,_|\___/ \_/\_/ |_| |_|_|\___/ \__,_|\__,_|___\__,_|\__,_|\__\__,_| # + # |_____| # + # # + ########################################################################### + + + Path to DATA/7D.M08A doesn't exist - creating it |===============================================| |===============================================| @@ -839,15 +892,25 @@ An example log printed on the terminal will look like: | Start: 2012-03-01 | | End: 2012-04-01 | - *********************************************************** - * Downloading day-long data for key 7D.M08A and day 2012.61 + ************************************************************ + * Downloading day-long data for key 7D.M08A and day 2012.061 * - * Channels selected: ['12', 'P'] and vertical - * 2012.061.*SAC - * -> Downloading Seismic data... + * 2012.061.*.SAC + * -> Downloading BH1 data... + * ...done + * -> Downloading BH2 data... + * ...done + * -> Downloading BHZ data... * ...done - * -> Downloading Pressure data... + * -> Downloading ?DH data... 
* ...done + * Start times are not all close to true start: + * BH1 2012-03-01T00:00:00.017800Z 2012-03-01T23:59:58.017800Z + * BH2 2012-03-01T00:00:00.017800Z 2012-03-01T23:59:58.017800Z + * BHZ 2012-03-01T00:00:00.017800Z 2012-03-01T23:59:58.017800Z + * BDH 2012-03-01T00:00:00.017800Z 2012-03-01T23:59:58.017800Z + * True start: 2012-03-01T00:00:00.000000Z + * -> Shifting traces to true start * -> Removing responses - Seismic data WARNING: FIR normalized: sum[coef]=9.767192E-01; WARNING: FIR normalized: sum[coef]=9.767192E-01; @@ -855,12 +918,13 @@ An example log printed on the terminal will look like: * -> Removing responses - Pressure data WARNING: FIR normalized: sum[coef]=9.767192E-01; - *********************************************************** - * Downloading day-long data for key 7D.M08A and day 2012.62 + ************************************************************ + * Downloading day-long data for key 7D.M08A and day 2012.062 * - * Channels selected: ['12', 'P'] and vertical - * 2012.062.*SAC - * -> Downloading Seismic data... + * 2012.062.*.SAC + * -> Downloading BH1 data... + * ...done + * -> Downloading BH2 data... ... @@ -873,7 +937,7 @@ This is where all day-long files will be stored on disk. For this step, there are several Parameter Settings that can be tuned. Once again, the default values are the ones used to reproduce the results of the Matlab -ATaCR software and can be left un-changed. The Time Search Settings can be used +ATaCR software. The Time Search Settings can be used to look at a subset of the available day-long data files. Here these options can be ignored since we wish to look at all the availble data that we just downloaded. 
We can therefore type in a terminal: @@ -882,6 +946,17 @@ We can therefore type in a terminal: $ atacr_daily_spectra M08A.pkl + ################################################################# + # _ _ _ _ # + # __| | __ _(_) |_ _ ___ _ __ ___ ___| |_ _ __ __ _ # + # / _` |/ _` | | | | | | / __| '_ \ / _ \/ __| __| '__/ _` | # + # | (_| | (_| | | | |_| | \__ \ |_) | __/ (__| |_| | | (_| | # + # \__,_|\__,_|_|_|\__, |___|___/ .__/ \___|\___|\__|_| \__,_| # + # |___/_____| |_| # + # # + ################################################################# + + Path to SPECTRA/7D.M08A/ doesn`t exist - creating it |===============================================| @@ -961,6 +1036,17 @@ therefore using all available data) and plot the results, we can type in a termi $ atacr_clean_spectra --figQC --figAverage --figCoh --figCross M08A.pkl + ################################################################### + # _ _ # + # ___| | ___ __ _ _ __ ___ _ __ ___ ___| |_ _ __ __ _ # + # / __| |/ _ \/ _` | '_ \ / __| '_ \ / _ \/ __| __| '__/ _` | # + # | (__| | __/ (_| | | | | \__ \ |_) | __/ (__| |_| | | (_| | # + # \___|_|\___|\__,_|_| |_|___|___/ .__/ \___|\___|\__|_| \__,_| # + # |_____| |_| # + # # + ################################################################### + + Path to AVG_STA/7D.M08A/ doesn`t exist - creating it |===============================================| @@ -1094,6 +1180,17 @@ In this case we do not need to specify any option and type in a terminal: $ atacr_transfer_functions M08A.pkl + ####################################################################################### + # _ __ __ _ _ # + # | |_ _ __ __ _ _ __ ___ / _| ___ _ __ / _|_ _ _ __ ___| |_(_) ___ _ __ ___ # + # | __| '__/ _` | '_ \/ __| |_ / _ \ '__|| |_| | | | '_ \ / __| __| |/ _ \| '_ \/ __| # + # | |_| | | (_| | | | \__ \ _| __/ | | _| |_| | | | | (__| |_| | (_) | | | \__ \ # + # \__|_| \__,_|_| |_|___/_| \___|_|___|_| \__,_|_| |_|\___|\__|_|\___/|_| |_|___/ # + # |_____| # + # # + 
####################################################################################### + + Path to TF_STA/7D.M08A/ doesn't exist - creating it |===============================================| @@ -1155,8 +1252,19 @@ Vanuatu earthquake (be conservative with the options), type in a terminal: $ atacr_download_event --min-mag=6.3 --max-mag=6.7 --start=2012-03-08 --end=2012-03-10 M08A.pkl - Path to EVENTS/7D.M08A/ doesn`t exist - creating it - + ############################################################################### + # _ _ _ _ # + # __| | _____ ___ __ | | ___ __ _ __| | _____ _____ _ __ | |_ # + # / _` |/ _ \ \ /\ / / '_ \| |/ _ \ / _` |/ _` | / _ \ \ / / _ \ '_ \| __| # + # | (_| | (_) \ V V /| | | | | (_) | (_| | (_| | | __/\ V / __/ | | | |_ # + # \__,_|\___/ \_/\_/ |_| |_|_|\___/ \__,_|\__,_|___\___| \_/ \___|_| |_|\__| # + # |_____| # + # # + ############################################################################### + + + Path to EVENTS/7D.M08A doesn't exist - creating it + |===============================================| |===============================================| | M08A | @@ -1165,8 +1273,8 @@ Vanuatu earthquake (be conservative with the options), type in a terminal: | Station: 7D.M08A | | Channel: BH; Locations: -- | | Lon: -124.90; Lat: 44.12 | - | Start time: 2011-10-20 00:00:00 | - | End time: 2012-07-18 23:59:59 | + | Start time: 2011-10-20 23:50:00 | + | End time: 2012-07-18 23:06:00 | |-----------------------------------------------| | Searching Possible events: | | Start: 2012-03-08 00:00:00 | @@ -1174,27 +1282,36 @@ Vanuatu earthquake (be conservative with the options), type in a terminal: | Mag: 6.3 - 6.7 | | ... 
| | Found 1 possible events | - - **************************************************** - * (1/1): 20120309_070953 + + ************************************************************ + * #(1/1): 20120309_070953 * Origin Time: 2012-03-09 07:09:53 * Lat: -19.22; Lon: 169.75 * Dep: 33.70; Mag: 6.6 - * M08A -> Ev: 9651.91 km; 86.80 deg; 239.43; 40.95 - - * Channels selected: ['12', 'P'] and vertical - * 2012.069.07.09 - * -> Downloading Seismic data... + * Dist: 9651.91 km; 86.80 deg + * 2012.069.07.09*.SAC + * -> Downloading BH1 data... + * ...done + * -> Downloading BH2 data... + * ...done + * -> Downloading BHZ data... + * ...done + * -> Downloading ?DH data... * ...done - * -> Downloading Pressure data... - ...done + * Start times are not all close to true start: + * BH1 2012-03-09T07:09:53.336000Z 2012-03-09T09:09:52.336000Z + * BH2 2012-03-09T07:09:53.336000Z 2012-03-09T09:09:52.336000Z + * BHZ 2012-03-09T07:09:53.336000Z 2012-03-09T09:09:52.336000Z + * BDH 2012-03-09T07:09:53.336000Z 2012-03-09T09:09:52.336000Z + * True start: 2012-03-09T07:09:53.320000Z + * -> Shifting traces to true start * -> Removing responses - Seismic data WARNING: FIR normalized: sum[coef]=9.767192E-01; WARNING: FIR normalized: sum[coef]=9.767192E-01; WARNING: FIR normalized: sum[coef]=9.767192E-01; * -> Removing responses - Pressure data - WARNING: FIR normalized: sum[coef]=9.767192E-01; - + WARNING: FIR normalized: sum[coef]=9.767192E-01; + The data are stored as an ``EventStream`` object, saved to disk in the newly created folder ``EVENTS/7D.M08A/``. 
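The pickled ``EventStream`` files can be read back for custom processing. Below is a minimal sketch, assuming only that ``atacr_download_event`` drops standard Python pickles with a ``.pkl`` extension into ``EVENTS/7D.M08A/``; the helper name ``load_event_pickles`` is ours, not part of the package:

```python
import glob
import pickle


def load_event_pickles(folder="EVENTS/7D.M08A"):
    """Load every pickled object found in *folder* (returns [] if none)."""
    streams = []
    for fname in sorted(glob.glob(folder + "/*.pkl")):
        with open(fname, "rb") as f:
            streams.append(pickle.load(f))
    return streams


events = load_event_pickles()
print("Loaded", len(events), "event file(s)")
```

When the files were produced by the script above, each loaded object is an ``obstools.atacr.EventStream`` instance; the sketch itself makes no assumption about the pickled type.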
@@ -1211,7 +1328,18 @@ the final Figures 11 and 12, specify the ``--figRaw`` and ``--figClean`` options $ atacr_correct_event --figRaw --figClean M08A.pkl - + ################################################################## + # _ _ # + # ___ ___ _ __ _ __ ___ ___| |_ _____ _____ _ __ | |_ # + # / __/ _ \| '__| '__/ _ \/ __| __| / _ \ \ / / _ \ '_ \| __| # + # | (_| (_) | | | | | __/ (__| |_ | __/\ V / __/ | | | |_ # + # \___\___/|_| |_| \___|\___|\__|___\___| \_/ \___|_| |_|\__| # + # |_____| # + # # + ################################################################## + + + |===============================================| |===============================================| | M08A | @@ -1220,8 +1348,8 @@ the final Figures 11 and 12, specify the ``--figRaw`` and ``--figClean`` options | Station: 7D.M08A | | Channel: BH; Locations: -- | | Lon: -124.90; Lat: 44.12 | - | Start time: 2011-10-20 00:00:00 | - | End time: 2012-07-18 23:59:59 | + | Start time: 2011-10-20 23:50:00 | + | End time: 2012-07-18 23:06:00 | |-----------------------------------------------| TF_STA/7D.M08A/2011.293-2012.200.transfunc.pkl file found - applying transfer functions TF_STA/7D.M08A/2012.069.transfunc.pkl file found - applying transfer functions diff --git a/docs/comply.rst b/docs/comply.rst index 95270ce..babedba 100644 --- a/docs/comply.rst +++ b/docs/comply.rst @@ -103,6 +103,17 @@ Usage .. 
code-block:: $ comply_calculate -h + + ################################################################################# + # _ _ _ _ # + # ___ ___ _ __ ___ _ __ | |_ _ ___ __ _| | ___ _ _| | __ _| |_ ___ # + # / __/ _ \| '_ ` _ \| '_ \| | | | | / __/ _` | |/ __| | | | |/ _` | __/ _ \ # + # | (_| (_) | | | | | | |_) | | |_| | | (_| (_| | | (__| |_| | | (_| | || __/ # + # \___\___/|_| |_| |_| .__/|_|\__, |___\___\__,_|_|\___|\__,_|_|\__,_|\__\___| # + # |_| |___/_____| # + # # + ################################################################################# + usage: comply_calculate [options] Script used to calculate compliance functions between various components. The noise data can be those obtained @@ -189,6 +200,17 @@ In this case we do not need to specify any option and type in a terminal: $ comply_calculate M08A.pkl + ################################################################################# + # _ _ _ _ # + # ___ ___ _ __ ___ _ __ | |_ _ ___ __ _| | ___ _ _| | __ _| |_ ___ # + # / __/ _ \| '_ ` _ \| '_ \| | | | | / __/ _` | |/ __| | | | |/ _` | __/ _ \ # + # | (_| (_) | | | | | | |_) | | |_| | | (_| (_| | | (__| |_| | | (_| | || __/ # + # \___\___/|_| |_| |_| .__/|_|\__, |___\___\__,_|_|\___|\__,_|_|\__,_|\__\___| # + # |_| |___/_____| # + # # + ################################################################################# + + Path to COMPL_STA/7D.M08A doesn't exist - creating it diff --git a/docs/conf.py b/docs/conf.py index c02dc6d..0fd72d6 100644 --- a/docs/conf.py +++ b/docs/conf.py @@ -29,7 +29,7 @@ # Add any Sphinx extension module names here, as strings. They can be # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom # ones. 
-extensions = ['sphinx.ext.autodoc', 'sphinx.ext.napoleon'] +extensions = ['sphinx.ext.autodoc', 'sphinx.ext.napoleon', 'sphinx.ext.viewcode'] autodoc_member_order = 'bysource' diff --git a/docs/index.rst b/docs/index.rst index 9a44af8..309b3e3 100644 --- a/docs/index.rst +++ b/docs/index.rst @@ -16,8 +16,6 @@ can be used through command-line scripts. :target: https://doi.org/10.5281/zenodo.4281480 .. image:: https://github.com/nfsi-canada/OBStools/workflows/Build/badge.svg :target: https://github.com/nfsi-canada/OBStools/actions -.. image:: https://codecov.io/gh/nfsi-canada/OBStools/branch/master/graph/badge.svg - :target: https://codecov.io/gh/nfsi-canada/OBStools .. toctree:: :maxdepth: 1 diff --git a/meson.build b/meson.build index 11064d7..86e0fc4 100644 --- a/meson.build +++ b/meson.build @@ -1,5 +1,5 @@ project('rfpy', 'c', - version : '0.2.0', + version : '0.2.1', license: 'MIT', meson_version: '>=0.64.0', ) diff --git a/obstools/__init__.py b/obstools/__init__.py index a1bdcf2..1518578 100644 --- a/obstools/__init__.py +++ b/obstools/__init__.py @@ -24,7 +24,7 @@ Licence ------- -Copyright 2019 Pascal Audet & Helen Janiszewski +Copyright 2019 Pascal Audet Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal @@ -50,19 +50,14 @@ Dependencies ++++++++++++ -The current version has been tested using **Python 3.10** \ -The following package is required before install ``obstools``: - -- `stdb `_ - -Other required packages (e.g., ``obspy``) -will be automatically installed by ``stdb``. +- `stdb `_ +- `obspy `_ Conda environment +++++++++++++++++ We recommend creating a custom ``conda`` environment -where ``OBStools`` can be installed along with some of its dependencies. +where ``OBStools`` can be installed along with its dependencies. .. sourcecode:: bash @@ -78,14 +73,14 @@ .. 
sourcecode:: bash

-    pip install stdb
+    pip install git+https://github.com/schaefferaj/stdb

 Installing development branch on GitHub
 +++++++++++++++++++++++++++++++++++++++

 .. sourcecode:: bash

-    pip install obstools@git+https://github.com/nfsi-canada/obstools
+    pip install git+https://github.com/nfsi-canada/obstools

 Installing from source
 ++++++++++++++++++++++
@@ -103,11 +98,75 @@

     pip install .

+Using local data
+----------------
+
+The various scripts packaged with ``OBStools`` use FDSN web services
+through an ``ObsPy`` `Client` to load waveform data. For waveform
+data stored locally on your hard drive, the scripts can use a `Client`
+that reads a `SeisComP Data Structure
+<https://www.seiscomp.de/seiscomp3/doc/applications/slarchive/SDS.html>`_
+archive containing SAC or miniSEED waveform data. Check out the script
+``atacr_download_data`` below and the arguments ``--local-data`` and
+``--dtype`` for more details.
+
+Station Metadata
+++++++++++++++++
+
+If you have data stored locally on your drive, it is likely you also
+have a station `XML `_ file
+containing the metadata. The corresponding ObsPy documentation is
+`here `_.
+
+To convert the station `XML` file to an input that can be read by
+``OBStools``, you run the command ``gen_stdb station.xml`` (only
+available in StDb version 0.2.7), which will create the file
+``station.pkl``. If you don't have a station `XML` file but you have
+a dataless SEED file, you can first convert it to `XML` using `this
+tool `_.
+
+Waveform Data
++++++++++++++
+
+The SDS folder containing the waveform data has the structure:
+
+.. code-block:: python
+
+    archive
+      + year
+        + network code
+          + station code
+            + channel code + type
+              + one file per day and location, e.g. NET.STA.LOC.CHAN.TYPE.YEAR.DOY
+
+For example:
+
+.. code-block:: python
+
+    SDS/
+      2014/
+        YH/
+          LOBS3/
+            HH1.D/
+              YH.LOBS3..HH1.D.2014.332
+              ...
+
+Note that the filename does not include the extension (`.MSEED` or
+`.SAC`); the characters `.D` (for type Data) appear in both the
+channel code and the filename.
Note also the two dots (`..`). If +there is a location code, it should appear between those dots (e.g., +for a location code `10`, the corresponding filename should be +`YH.LOBS3.10.HH1.D.2014.332`). There is no location code for the +YH.LOBS3 data, and this field is simply absent from the filenames. +Finally, the day-of-year (DOY) field must be zero-padded to be exactly +3 characters. """ -__version__ = '0.2.0' +__version__ = '0.2.1' -__author__ = 'Pascal Audet & Helen Janiszewski' +__author__ = 'Pascal Audet' from . import atacr from . import comply diff --git a/obstools/atacr/classes.py b/obstools/atacr/classes.py index 6473e55..f4e967e 100644 --- a/obstools/atacr/classes.py +++ b/obstools/atacr/classes.py @@ -110,13 +110,13 @@ class Rotation(object): Maximum coherence phase_value : float Phase at maximum coherence - direc : :class:`~numpy.ndarray` + phi : :class:`~numpy.ndarray` Directions for which the coherence is calculated """ def __init__(self, cHH=None, cHZ=None, cHP=None, coh=None, ph=None, - tilt=None, coh_value=None, phase_value=None, direc=None): + tilt=None, coh_value=None, phase_value=None, phi=None): self.cHH = cHH self.cHZ = cHZ @@ -126,7 +126,7 @@ def __init__(self, cHH=None, cHZ=None, cHP=None, coh=None, ph=None, self.tilt = tilt self.coh_value = coh_value self.phase_value = phase_value - self.direc = direc + self.phi = phi class DayNoise(object): @@ -519,34 +519,34 @@ def QC_daily_spectra(self, pd=[0.004, 0.2], tol=1.5, alpha=0.05, dsl_psdP = sl_psdP[ff, :] - np.mean(sl_psdP[ff, :], axis=0) dsls = [dsl_psd1, dsl_psd2, dsl_psdZ, dsl_psdP] - if self.ncomp == 2: - plt.figure(2) - plt.subplot(2, 1, 1) - plt.semilogx(f, sl_psdZ, 'g', lw=0.5) - plt.subplot(2, 1, 2) - plt.semilogx(f, sl_psdP, 'k', lw=0.5) - plt.tight_layout() - elif self.ncomp == 3: - plt.figure(2) - plt.subplot(3, 1, 1) - plt.semilogx(f, sl_psd1, 'r', lw=0.5) - plt.subplot(3, 1, 2) - plt.semilogx(f, sl_psd2, 'b', lw=0.5) - plt.subplot(3, 1, 3) - plt.semilogx(f, sl_psdZ, 'g', lw=0.5) 
- plt.tight_layout() - else: - plt.figure(2) - plt.subplot(4, 1, 1) - plt.semilogx(f, sl_psd1, 'r', lw=0.5) - plt.subplot(4, 1, 2) - plt.semilogx(f, sl_psd2, 'b', lw=0.5) - plt.subplot(4, 1, 3) - plt.semilogx(f, sl_psdZ, 'g', lw=0.5) - plt.subplot(4, 1, 4) - plt.semilogx(f, sl_psdP, 'k', lw=0.5) - plt.tight_layout() if debug: + if self.ncomp == 2: + plt.figure(2) + plt.subplot(2, 1, 1) + plt.semilogx(f, sl_psdZ, 'g', lw=0.5) + plt.subplot(2, 1, 2) + plt.semilogx(f, sl_psdP, 'k', lw=0.5) + plt.tight_layout() + elif self.ncomp == 3: + plt.figure(2) + plt.subplot(3, 1, 1) + plt.semilogx(f, sl_psd1, 'r', lw=0.5) + plt.subplot(3, 1, 2) + plt.semilogx(f, sl_psd2, 'b', lw=0.5) + plt.subplot(3, 1, 3) + plt.semilogx(f, sl_psdZ, 'g', lw=0.5) + plt.tight_layout() + else: + plt.figure(2) + plt.subplot(4, 1, 1) + plt.semilogx(f, sl_psd1, 'r', lw=0.5) + plt.subplot(4, 1, 2) + plt.semilogx(f, sl_psd2, 'b', lw=0.5) + plt.subplot(4, 1, 3) + plt.semilogx(f, sl_psdZ, 'g', lw=0.5) + plt.subplot(4, 1, 4) + plt.semilogx(f, sl_psdP, 'k', lw=0.5) + plt.tight_layout() plt.show() # Cycle through to kill high-std-norm windows @@ -616,8 +616,9 @@ def QC_daily_spectra(self, pd=[0.004, 0.2], tol=1.5, alpha=0.05, self.QC = True - def average_daily_spectra(self, calc_rotation=True, fig_average=False, - fig_coh_ph=False, save=None, form='png'): + def average_daily_spectra(self, calc_rotation=True, tiltfreqs=[0.005, 0.035], + fig_average=False, fig_coh_ph=False, save=None, + form='png'): """ Method to average the daily spectra for good windows. 
By default, the method will attempt to calculate the azimuth of maximum coherence @@ -629,6 +630,9 @@ def average_daily_spectra(self, calc_rotation=True, fig_average=False, ---------- calc_rotation : boolean Whether or not to calculate the tilt direction + tiltfreqs : list + Two floats representing the frequency band at which the tilt is + calculated fig_average : boolean Whether or not to produce a figure showing the average daily spectra @@ -774,15 +778,15 @@ def average_daily_spectra(self, calc_rotation=True, fig_average=False, plot.show() if calc_rotation and self.ncomp >= 3: - cHH, cHZ, cHP, coh, ph, direc, tilt, coh_value, phase_value = \ + cHH, cHZ, cHP, coh, ph, phi, tilt, coh_value, phase_value = \ utils.calculate_tilt( self.ft1, self.ft2, self.ftZ, self.ftP, self.f, - self.goodwins) + self.goodwins, tiltfreqs) self.rotation = Rotation( - cHH, cHZ, cHP, coh, ph, tilt, coh_value, phase_value, direc) + cHH, cHZ, cHP, coh, ph, tilt, coh_value, phase_value, phi) if fig_coh_ph: - plot = plotting.fig_coh_ph(coh, ph, direc) + plot = plotting.fig_coh_ph(coh, ph, phi) # Save or show figure if save: @@ -1018,7 +1022,7 @@ def init(self): are available for the power, cross and rotated spectra: [11, 12, 1Z, 1P, 22, 2Z, 2P, ZZ, ZP, PP, HH, HZ, HP] - direc : `numpy.ndarray` + phi : `numpy.ndarray` Array of azimuths used in determining the tilt direction tilt : float Tilt direction from maximum coherence between rotated `H1` and @@ -1050,7 +1054,7 @@ def init(self): AttributeError: 'StaNoise' object has no attribute 'daylist' >>> stanoise.__dict__.keys() dict_keys(['initialized', 'c11', 'c22', 'cZZ', 'cPP', 'c12', 'c1Z', - 'c1P', 'c2Z', 'c2P', 'cZP', 'cHH', 'cHZ', 'cHP', 'direc', 'tilt', 'f', + 'c1P', 'c2Z', 'c2P', 'cZP', 'cHH', 'cHZ', 'cHP', 'phi', 'tilt', 'f', 'nwins', 'ncomp', 'key', 'tf_list', 'QC', 'av']) """ @@ -1082,7 +1086,7 @@ def init(self): self.cHH = np.array([dn.rotation.cHH for dn in self.daylist]).T self.cHZ = np.array([dn.rotation.cHZ for dn in 
self.daylist]).T self.cHP = np.array([dn.rotation.cHP for dn in self.daylist]).T - self.direc = self.daylist[0].rotation.direc + self.phi = self.daylist[0].rotation.phi self.tilt = self.daylist[0].rotation.tilt self.f = self.daylist[0].f self.nwins = np.array([np.sum(dn.goodwins) for dn in self.daylist]) @@ -1193,34 +1197,34 @@ def QC_sta_spectra(self, pd=[0.004, 0.2], tol=2.0, alpha=0.05, dsl_cPP = sl_cPP[ff, :] - np.mean(sl_cPP[ff, :], axis=0) dsls = [dsl_c11, dsl_c22, dsl_cZZ, dsl_cPP] - if self.ncomp == 2: - plt.figure(2) - plt.subplot(2, 1, 1) - plt.semilogx(self.f[faxis], sl_cZZ[faxis], 'g', lw=0.5) - plt.subplot(2, 1, 2) - plt.semilogx(self.f[faxis], sl_cPP[faxis], 'k', lw=0.5) - plt.tight_layout() - elif self.ncomp == 3: - plt.figure(2) - plt.subplot(3, 1, 1) - plt.semilogx(self.f[faxis], sl_c11[faxis], 'r', lw=0.5) - plt.subplot(3, 1, 2) - plt.semilogx(self.f[faxis], sl_c22[faxis], 'b', lw=0.5) - plt.subplot(3, 1, 3) - plt.semilogx(self.f[faxis], sl_cZZ[faxis], 'g', lw=0.5) - plt.tight_layout() - else: - plt.figure(2) - plt.subplot(4, 1, 1) - plt.semilogx(self.f[faxis], sl_c11[faxis], 'r', lw=0.5) - plt.subplot(4, 1, 2) - plt.semilogx(self.f[faxis], sl_c22[faxis], 'b', lw=0.5) - plt.subplot(4, 1, 3) - plt.semilogx(self.f[faxis], sl_cZZ[faxis], 'g', lw=0.5) - plt.subplot(4, 1, 4) - plt.semilogx(self.f[faxis], sl_cPP[faxis], 'k', lw=0.5) - plt.tight_layout() if debug: + if self.ncomp == 2: + plt.figure(2) + plt.subplot(2, 1, 1) + plt.semilogx(self.f[faxis], sl_cZZ[faxis], 'g', lw=0.5) + plt.subplot(2, 1, 2) + plt.semilogx(self.f[faxis], sl_cPP[faxis], 'k', lw=0.5) + plt.tight_layout() + elif self.ncomp == 3: + plt.figure(2) + plt.subplot(3, 1, 1) + plt.semilogx(self.f[faxis], sl_c11[faxis], 'r', lw=0.5) + plt.subplot(3, 1, 2) + plt.semilogx(self.f[faxis], sl_c22[faxis], 'b', lw=0.5) + plt.subplot(3, 1, 3) + plt.semilogx(self.f[faxis], sl_cZZ[faxis], 'g', lw=0.5) + plt.tight_layout() + else: + plt.figure(2) + plt.subplot(4, 1, 1) + plt.semilogx(self.f[faxis], 
sl_c11[faxis], 'r', lw=0.5) + plt.subplot(4, 1, 2) + plt.semilogx(self.f[faxis], sl_c22[faxis], 'b', lw=0.5) + plt.subplot(4, 1, 3) + plt.semilogx(self.f[faxis], sl_cZZ[faxis], 'g', lw=0.5) + plt.subplot(4, 1, 4) + plt.semilogx(self.f[faxis], sl_cPP[faxis], 'k', lw=0.5) + plt.tight_layout() plt.show() # Cycle through to kill high-std-norm windows diff --git a/obstools/atacr/utils.py b/obstools/atacr/utils.py index 59b47cd..c1f5afa 100644 --- a/obstools/atacr/utils.py +++ b/obstools/atacr/utils.py @@ -365,7 +365,7 @@ def get_event(eventpath, tstart, tend): return tr1, tr2, trZ, trP -def calculate_tilt(ft1, ft2, ftZ, ftP, f, goodwins, tiltfreq=[0.005, 0.035]): +def calculate_tilt(ft1, ft2, ftZ, ftP, f, goodwins, tiltfreqs): """ Determines tilt direction from maximum coherence between rotated H1 and Z. @@ -379,7 +379,7 @@ def calculate_tilt(ft1, ft2, ftZ, ftP, f, goodwins, tiltfreq=[0.005, 0.035]): List of booleans representing whether a window is good (True) or not (False). This attribute is returned from the method :func:`~obstools.atacr.classes.DayNoise.QC_daily_spectra` - tiltfreq : list, optional + tiltfreqs : list Two floats representing the frequency band at which the tilt is calculated @@ -394,7 +394,7 @@ def calculate_tilt(ft1, ft2, ftZ, ftP, f, goodwins, tiltfreq=[0.005, 0.035]): ph : :class:`~numpy.ndarray` Phase value between rotated H and Z components, as a function of directions (azimuths) - direc : :class:`~numpy.ndarray` + phi : :class:`~numpy.ndarray` Array of directions (azimuths) considered tilt : float Direction (azimuth) of maximum coherence between rotated H1 and Z @@ -405,16 +405,18 @@ def calculate_tilt(ft1, ft2, ftZ, ftP, f, goodwins, tiltfreq=[0.005, 0.035]): """ - direc = np.arange(0., 360., 10.) - coh = np.zeros(len(direc)) - ph = np.zeros(len(direc)) - cZZ = np.abs(np.mean(ftZ[goodwins, :] * - np.conj(ftZ[goodwins, :]), axis=0))[0:len(f)] + phi = np.arange(0., 360., 10.) 
+ coh = np.zeros(len(phi)) + ph = np.zeros(len(phi)) + cZZ = np.abs( + np.mean(ftZ[goodwins, :] * \ + np.conj(ftZ[goodwins, :]), axis=0))[0:len(f)] - for i, d in enumerate(direc): + for i, d in enumerate(phi): - # Rotate horizontals - ftH = rotate_dir(ft1, ft2, d) + # Rotate horizontals - clockwise from 1 + # components 1, 2 correspond to y, x in the cartesian plane + ftH = rotate_dir(ft2, ft1, d) # Get transfer functions cHH = np.abs(np.mean(ftH[goodwins, :] * @@ -426,8 +428,8 @@ def calculate_tilt(ft1, ft2, ftZ, ftP, f, goodwins, tiltfreq=[0.005, 0.035]): Ph = phase(cHZ) # Calculate coherence over frequency band - coh[i] = np.mean(Co[(f > tiltfreq[0]) & (f < tiltfreq[1])]) - ph[i] = np.pi/2. - np.mean(Ph[(f > tiltfreq[0]) & (f < tiltfreq[1])]) + coh[i] = np.mean(Co[(f > tiltfreqs[0]) & (f < tiltfreqs[1])]) + ph[i] = np.mean(Ph[(f > tiltfreqs[0]) & (f < tiltfreqs[1])]) # Index where coherence is max ind = np.argwhere(coh == coh.max()) @@ -435,17 +437,34 @@ def calculate_tilt(ft1, ft2, ftZ, ftP, f, goodwins, tiltfreq=[0.005, 0.035]): # Phase and direction at maximum coherence phase_value = ph[ind[0]][0] coh_value = coh[ind[0]][0] - tilt = direc[ind[0]][0] + tilt = phi[ind[0]][0] + + # Phase has to be close to zero: + # Check phase std near max coherence + start = max(0, ind[0][0] - 1) + end = min(len(ph), ind[0][0] + 2) + phase_std = np.std(ph[start:end]) + # print('Tilt at Maximum coherence = ', tilt) + # print('Phase at Maximum coherence = ', phase_value) + # print('Phase std near Maximum coherence = ', phase_std) + + if phase_std > 0.1: + tilt += 180. + if tilt > 360.: + tilt -= 360. + + # print('Tilt corrected = ', tilt) # Refine search - rdirec = np.arange(direc[ind[0]][0]-10., direc[ind[0]][0]+10., 1.) - rcoh = np.zeros(len(direc)) - rph = np.zeros(len(direc)) + rphi = np.arange(tilt-10., tilt+10., 1.) 
+ rcoh = np.zeros(len(rphi)) + rph = np.zeros(len(rphi)) - for i, d in enumerate(rdirec): + for i, d in enumerate(rphi): - # Rotate horizontals - ftH = rotate_dir(ft1, ft2, d) + # Rotate horizontals - clockwise from 1 + # components 1, 2 correspond to y, x in the cartesian plane + ftH = rotate_dir(ft2, ft1, d) # Get transfer functions cHH = np.abs(np.mean(ftH[goodwins, :] * @@ -457,8 +476,8 @@ def calculate_tilt(ft1, ft2, ftZ, ftP, f, goodwins, tiltfreq=[0.005, 0.035]): Ph = phase(cHZ) # Calculate coherence over frequency band - rcoh[i] = np.mean(Co[(f > tiltfreq[0]) & (f < tiltfreq[1])]) - rph[i] = np.pi/2. - np.mean(Ph[(f > tiltfreq[0]) & (f < tiltfreq[1])]) + rcoh[i] = np.mean(Co[(f > tiltfreqs[0]) & (f < tiltfreqs[1])]) + rph[i] = np.mean(Ph[(f > tiltfreqs[0]) & (f < tiltfreqs[1])]) # Index where coherence is max ind = np.argwhere(rcoh == rcoh.max()) @@ -466,15 +485,7 @@ def calculate_tilt(ft1, ft2, ftZ, ftP, f, goodwins, tiltfreq=[0.005, 0.035]): # Phase and direction at maximum coherence phase_value = rph[ind[0]][0] coh_value = rcoh[ind[0]][0] - tilt = rdirec[ind[0]][0] - - # Phase has to be close to zero - otherwise add pi - if phase_value > 0.5*np.pi: - tilt += 180. - if tilt > 360.: - tilt -= 360. - - # print('Maximum coherence for tilt = ', tilt) + tilt = rphi[ind[0]][0] # Now calculate spectra at tilt direction ftH = rotate_dir(ft1, ft2, tilt) @@ -489,7 +500,7 @@ def calculate_tilt(ft1, ft2, ftZ, ftP, f, goodwins, tiltfreq=[0.005, 0.035]): else: cHP = None - return cHH, cHZ, cHP, coh, ph, direc, tilt, coh_value, phase_value + return cHH, cHZ, cHP, coh, ph, phi, tilt, coh_value, phase_value def smooth(data, nd, axis=0): @@ -601,16 +612,16 @@ def phase(Gxy): return None -def rotate_dir(tr1, tr2, direc): +def rotate_dir(x, y, theta): - d = -direc*np.pi/180.+np.pi/2. - rot_mat = np.array([[np.cos(d), -np.sin(d)], - [np.sin(d), np.cos(d)]]) + d = -theta*np.pi/180. 
+ rot_mat = [[np.cos(d), -np.sin(d)], + [np.sin(d), np.cos(d)]] - v12 = np.array([tr2, tr1]) - vxy = np.tensordot(rot_mat, v12, axes=1) - tr_2 = vxy[0, :] - tr_1 = vxy[1, :] + vxy = [x, y] + vxy_rotated = np.tensordot(rot_mat, vxy, axes=1) + tr_2 = vxy_rotated[0] + tr_1 = vxy_rotated[1] return tr_1 diff --git a/obstools/scripts/atacr_clean_spectra.py b/obstools/scripts/atacr_clean_spectra.py index f71983d..8a067b1 100644 --- a/obstools/scripts/atacr_clean_spectra.py +++ b/obstools/scripts/atacr_clean_spectra.py @@ -27,24 +27,19 @@ import numpy as np import pickle import stdb +import copy + +from obspy import UTCDateTime + from obstools.atacr import StaNoise, Power, Cross, Rotation from obstools.atacr import utils, plotting -from pathlib import Path +from pathlib import Path from argparse import ArgumentParser from os.path import exists as exist -from obspy import UTCDateTime -from numpy import nan def get_cleanspec_arguments(argv=None): - """ - Get Options from :class:`~optparse.OptionParser` objects. - - Calling options for the script `obs_clean_spectra.py` that accompany this - package. 
-
-    """

    parser = ArgumentParser(
        usage="%(prog)s [options] <indb>",
@@ -114,7 +109,7 @@ def get_cleanspec_arguments(argv=None):
        description="Miscellaneous default values " +
        "and settings")
    ConstGroup.add_argument(
-        "--freq-band",
+        "--flag-freqs",
        action="store",
        type=str,
        dest="pd",
@@ -243,6 +238,18 @@ def get_cleanspec_arguments(argv=None):

def main(args=None):

+    print()
+    print("###################################################################")
+    print("#       _                                           _             #")
+    print("#   ___| | ___  __ _ _ __  ___ _ __   ___  ___| |_ _ __ __ _      #")
+    print("#  / __| |/ _ \/ _` | '_ \/ __| '_ \ / _ \/ __| __| '__/ _` |     #")
+    print("# | (__| |  __/ (_| | | | \__ \ |_) |  __/ (__| |_| | | (_| |     #")
+    print("#  \___|_|\___|\__,_|_| |_|___|___/ .__/ \___|\___|\__|_|  \__,_| #")
+    print("#                             |_____|  |_|                        #")
+    print("#                                                                 #")
+    print("###################################################################")
+    print()
+
    if args is None:
        # Run Input Parser
        args = get_cleanspec_arguments()
@@ -312,13 +319,12 @@ def main(args=None):
            continue

        # Temporary print locations
-        tlocs = sta.location
+        tlocs = copy.copy(sta.location)
        if len(tlocs) == 0:
            tlocs = ['']
        for il in range(0, len(tlocs)):
            if len(tlocs[il]) == 0:
                tlocs[il] = "--"
-        sta.location = tlocs

        # Update Display
        print("\n|===============================================|")
@@ -566,9 +572,9 @@ def main(args=None):
            else:
                plot.show()

-        if args.fig_coh_ph and stanoise.direc is not None:
+        if args.fig_coh_ph and stanoise.phi is not None:
            fname = stkey + '.' + 'coh_ph'
-            plot = plotting.fig_coh_ph(coh_all, ph_all, stanoise.direc)
+            plot = plotting.fig_coh_ph(coh_all, ph_all, stanoise.phi)
            if plotpath:
                plot.savefig(
                    str(plotpath / (fname + '.'
+ args.form)), diff --git a/obstools/scripts/atacr_correct_event.py b/obstools/scripts/atacr_correct_event.py index 0f00197..8ccd3c7 100644 --- a/obstools/scripts/atacr_correct_event.py +++ b/obstools/scripts/atacr_correct_event.py @@ -25,26 +25,21 @@ # Import modules and functions import numpy as np -from obspy import UTCDateTime, Stream import pickle import stdb +import copy + +from obspy import UTCDateTime, Stream + from obstools.atacr import EventStream from obstools.atacr import utils, plotting -from pathlib import Path +from pathlib import Path from argparse import ArgumentParser from os.path import exists as exist -from numpy import nan def get_correct_arguments(argv=None): - """ - Get Options from :class:`~optparse.OptionParser` objects. - - Calling options for the script `obs_correct_event.py` that accompanies this - package. - - """ parser = ArgumentParser( usage="%(prog)s [options] ", @@ -53,9 +48,9 @@ def get_correct_arguments(argv=None): "components, and use them to clean vertical " + "component of OBS data for selected events. The " + "noise data can be those obtained from the daily " + - "spectra (i.e., from `obs_daily_spectra.py`) " + "spectra (i.e., from `atacr_daily_spectra`) " "or those obtained from the averaged noise spectra " + - "(i.e., from `obs_clean_spectra.py`). Flags are " + + "(i.e., from `atacr_clean_spectra`). Flags are " + "available to specify the source of data to use as " + "well as the time range for given events. 
" "The stations are processed one by one and the " + @@ -229,6 +224,18 @@ def get_correct_arguments(argv=None): def main(args=None): + print() + print("##################################################################") + print("# _ _ #") + print("# ___ ___ _ __ _ __ ___ ___| |_ _____ _____ _ __ | |_ #") + print("# / __/ _ \| '__| '__/ _ \/ __| __| / _ \ \ / / _ \ '_ \| __| #") + print("# | (_| (_) | | | | | __/ (__| |_ | __/\ V / __/ | | | |_ #") + print("# \___\___/|_| |_| \___|\___|\__|___\___| \_/ \___|_| |_|\__| #") + print("# |_____| #") + print("# #") + print("##################################################################") + print() + if args is None: # Run Input Parser args = get_correct_arguments() @@ -297,13 +304,12 @@ def main(args=None): continue # Temporary print locations - tlocs = sta.location + tlocs = copy.copy(sta.location) if len(tlocs) == 0: tlocs = [''] for il in range(0, len(tlocs)): if len(tlocs[il]) == 0: - tlocs[il] = "--" - sta.location = tlocs + tlocs.append("--") # Update Display print(" ") diff --git a/obstools/scripts/atacr_daily_spectra.py b/obstools/scripts/atacr_daily_spectra.py index 2e7b6f4..e6e37cf 100644 --- a/obstools/scripts/atacr_daily_spectra.py +++ b/obstools/scripts/atacr_daily_spectra.py @@ -24,27 +24,21 @@ # Import modules and functions -import os import numpy as np import pickle import stdb +import copy + +from obspy import UTCDateTime + from obstools.atacr import utils, DayNoise -from pathlib import Path +from pathlib import Path from argparse import ArgumentParser from os.path import exists as exist -from obspy import UTCDateTime -from numpy import nan def get_dailyspec_arguments(argv=None): - """ - Get Options from :class:`~optparse.OptionParser` objects. - - Calling options for the script `obs_daily_spectra.py` that accompany this - package. - - """ parser = ArgumentParser( usage="%(prog)s [options] ", @@ -143,7 +137,7 @@ def get_dailyspec_arguments(argv=None): "in any given day to continue with analysis. 
" + "[Default 10]") ConstGroup.add_argument( - "--freq-band", + "--flag-freqs", action="store", type=str, dest="pd", @@ -152,6 +146,15 @@ def get_dailyspec_arguments(argv=None): "(float, in Hz) over which to calculate spectral " + "features used in flagging the bad windows. " + "[Default 0.004,2.0]") + ConstGroup.add_argument( + "--tilt-freqs", + action="store", + type=str, + dest="tf", + default=None, + help="Specify comma-separated frequency limits " + + "(float, in Hz) over which to calculate tilt. " + + "[Default 0.005,0.035]") ConstGroup.add_argument( "--tolerance", action="store", @@ -279,7 +282,18 @@ def get_dailyspec_arguments(argv=None): args.pd = sorted(args.pd) if (len(args.pd)) != 2: raise(Exception( - "Error: --freq-band should contain 2 " + + "Error: --flag-freqs should contain 2 " + + "comma-separated floats")) + + # Check input frequency band + if args.tf is None: + args.tf = [0.005, 0.035] + else: + args.tf = [float(val) for val in args.tf.split(',')] + args.tf = sorted(args.tf) + if (len(args.pd)) != 2: + raise(Exception( + "Error: --tilt-freqs should contain 2 " + "comma-separated floats")) return args @@ -287,6 +301,18 @@ def get_dailyspec_arguments(argv=None): def main(args=None): + print() + print("#################################################################") + print("# _ _ _ _ #") + print("# __| | __ _(_) |_ _ ___ _ __ ___ ___| |_ _ __ __ _ #") + print("# / _` |/ _` | | | | | | / __| '_ \ / _ \/ __| __| '__/ _` | #") + print("# | (_| | (_| | | | |_| | \__ \ |_) | __/ (__| |_| | | (_| | #") + print("# \__,_|\__,_|_|_|\__, |___|___/ .__/ \___|\___|\__|_| \__,_| #") + print("# |___/_____| |_| #") + print("# #") + print("#################################################################") + print() + if args is None: # Run Input Parser args = get_dailyspec_arguments() @@ -355,13 +381,12 @@ def main(args=None): continue # Temporary print locations - tlocs = sta.location + tlocs = copy.copy(sta.location) if len(tlocs) == 0: tlocs = [''] for il 
in range(0, len(tlocs)):
            if len(tlocs[il]) == 0:
                tlocs[il] = "--"
-        sta.location = tlocs

        # Update Display
        print("\n|===============================================|")
@@ -433,6 +458,7 @@ def main(args=None):
            # Average spectra for good windows
            daynoise.average_daily_spectra(
                calc_rotation=args.calc_rotation,
+                tiltfreqs=args.tf,
                fig_average=args.fig_average,
                fig_coh_ph=args.fig_coh_ph,
                save=plotpath, form=args.form)
diff --git a/obstools/scripts/atacr_download_data.py b/obstools/scripts/atacr_download_data.py
index 1516c2b..6c6c734 100644
--- a/obstools/scripts/atacr_download_data.py
+++ b/obstools/scripts/atacr_download_data.py
@@ -25,33 +25,29 @@

 # Import modules and functions

 import numpy as np
-import os.path
 import pickle
 import stdb
-from obspy.clients.fdsn import Client
-from obspy import Stream, UTCDateTime
+import copy
+
+from obspy.clients.fdsn import Client as FDSN_Client
+from obspy.clients.filesystem.sds import Client as SDS_Client
+from obspy import Stream, UTCDateTime, read_inventory
+
 from obstools.atacr import utils
-from pathlib import Path
+from pathlib import Path
 from argparse import ArgumentParser
 from os.path import exists as exist
-from numpy import nan
+import os.path as osp


def get_daylong_arguments(argv=None):
-    """
-    Get Options from :class:`~optparse.OptionParser` objects.
-
-    Calling options for the script `obs_download_data.py` that accompany this
-    package.
-
-    """

    parser = ArgumentParser(
        usage="%(prog)s [options] <indb>",
        description="Script used " +
        "to download and pre-process up to four-component " +
-        "(H1, H2, Z and P), day-long seismograms to use in " +
+        "(1,2,Z,H), day-long seismograms to use in " +
        "noise corrections of vertical component of OBS data. " +
        "Data are requested from the internet using the client " +
        "services framework for a given date range. 
The stations " + @@ -75,26 +71,6 @@ def get_daylong_arguments(argv=None): "For instance, providing IU will match with all stations " + "in the IU network. " + "[Default processes all stations in the database]") - parser.add_argument( - "-C", "--channels", - action="store", - type=str, - dest="channels", - default="", - help="Specify a comma-separated list of channels for " + - "which to perform the transfer function analysis. " + - "Possible options are '12' (for horizontal channels 1 and 2) " + - "and/or 'P' (for pressure channel). Specifying '12' allows " + - "for tilt correction. Specifying 'P' allows for compliance " + - "correction. [Default '12,P' looks for both horizontal and " + - "pressure and allows for both tilt AND compliance corrections]") - parser.add_argument( - "--zcomp", - dest="zcomp", - type=str, - default="Z", - help="Specify the Vertical Component Channel Identifier. " + - "[Default Z].") parser.add_argument( "-O", "--overwrite", action="store_true", @@ -102,6 +78,13 @@ def get_daylong_arguments(argv=None): default=False, help="Force the overwriting of pre-existing data. " + "[Default False]") + parser.add_argument( + "-Z", "--zcomp", + dest="zcomp", + type=str, + default="Z", + help="Specify the Vertical Component Channel Identifier. " + + "[Default Z].") # Server Settings ServerGroup = parser.add_argument_group( @@ -143,31 +126,35 @@ def get_daylong_arguments(argv=None): "be provided in form of the PGP message as a string, or the filename " "of a local file with the PGP message in it. 
[Default None]")

-    """
-    # Database Settings
+    # Use local data directory
     DataGroup = parser.add_argument_group(
         title="Local Data Settings",
         description="Settings associated with defining " +
-        "and using a local data base of pre-downloaded day-long SAC files.")
+        "and using a local database of pre-downloaded " +
+        "day-long SAC or MSEED files.")
     DataGroup.add_argument(
         "--local-data",
         action="store",
         type=str,
         dest="localdata",
         default=None,
-        help="Specify a comma separated list of paths containing " +
-        "day-long sac files of data already downloaded. " +
-        "If data exists for a seismogram is already present on disk, " +
-        "it is selected preferentially over downloading " +
-        "the data using the Client interface")
+        help="Specify absolute path to a SeisComP Data Structure (SDS) " +
+        "archive containing day-long SAC or MSEED files " +
+        "(e.g., --local-data=/Home/username/Data/SDS). " +
+        "See https://www.seiscomp.de/seiscomp3/doc/applications/slarchive/SDS.html " +
+        "for details on the SDS format. If this option is used, it takes " +
+        "precedence over the --server settings.")
     DataGroup.add_argument(
-        "--no-data-zero",
-        action="store_true",
-        dest="ndval",
-        default=False,
-        help="Specify to force missing data to be set as zero, " +
-        "rather than default behaviour which sets to nan.")
-"""
+        "--dtype",
+        action="store",
+        type=str,
+        dest="dtype",
+        default='MSEED',
+        help="Specify the data archive file type, either SAC " +
+        "or MSEED. Note the default behaviour is to search for " +
+        "MSEED files. Local archive files must have extensions of " +
+        "'.SAC' or '.MSEED'. 
These are case-sensitive, so specify " +
+        "the correct case here.")

     # Constants Settings
     FreqGroup = parser.add_argument_group(
@@ -233,15 +220,6 @@ def get_daylong_arguments(argv=None):
     if len(args.stkeys) > 0:
         args.stkeys = args.stkeys.split(',')

-    # create channel list
-    if len(args.channels) > 0:
-        args.channels = args.channels.split(',')
-    else:
-        args.channels = ["12", "P"]
-    for cha in args.channels:
-        if cha not in ["12", "P"]:
-            parser.error("Error: Channel not recognized " + str(cha))
-
     # construct start time
     if len(args.startT) > 0:
         try:
@@ -279,17 +257,12 @@ def get_daylong_arguments(argv=None):
     else:
         args.userauth = [None, None]

-    # # Parse Local Data directories
-    # if args.localdata is not None:
-    #     args.localdata = args.localdata.split(',')
-    # else:
-    #     args.localdata = []
-
-    # # Check NoData Value
-    # if args.ndval:
-    #     args.ndval = 0.0
-    # else:
-    #     args.ndval = nan
+    # Check Datatype specification
+    if args.dtype.upper() not in ['MSEED', 'SAC']:
+        parser.error(
+            "Error: Local Data Archive must be of type 'SAC' " +
+            "or 'MSEED'. These must match the file extensions of " +
+            "the archived data.")

     if args.units not in ['DISP', 'VEL', 'ACC']:
         msg = ("Error: invalid --units argument. 
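The `--dtype` validation added in this patch normalizes the value with `.upper()` before comparing, so `sac` and `mseed` pass the check even though the archive file extensions on disk remain case-sensitive. A minimal, stdlib-only sketch of that logic (the function name `parse_dtype` is illustrative, not part of OBStools):

```python
from argparse import ArgumentParser


def parse_dtype(argv):
    # Mirror the --dtype option: default MSEED, case-insensitive validation
    parser = ArgumentParser()
    parser.add_argument("--dtype", default="MSEED")
    args = parser.parse_args(argv)
    if args.dtype.upper() not in ("MSEED", "SAC"):
        parser.error("Local data archive must be of type 'SAC' or 'MSEED'")
    return args.dtype


print(parse_dtype([]))                   # MSEED (the default)
print(parse_dtype(["--dtype", "sac"]))   # sac (passes the case-insensitive check)
```

Note the check only validates the flag value; the extension case of the archived files (`.SAC` vs `.MSEED`) must still match what is on disk.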
Choose among " + @@ -311,6 +284,18 @@ def get_daylong_arguments(argv=None): def main(args=None): + print() + print("###########################################################################") + print("# _ _ _ _ _ #") + print("# __| | _____ ___ __ | | ___ __ _ __| | __| | __ _| |_ __ _ #") + print("# / _` |/ _ \ \ /\ / / '_ \| |/ _ \ / _` |/ _` | / _` |/ _` | __/ _` | #") + print("# | (_| | (_) \ V V /| | | | | (_) | (_| | (_| | | (_| | (_| | || (_| | #") + print("# \__,_|\___/ \_/\_/ |_| |_|_|\___/ \__,_|\__,_|___\__,_|\__,_|\__\__,_| #") + print("# |_____| #") + print("# #") + print("###########################################################################") + print() + if args is None: # Run Input Parser args = get_daylong_arguments() @@ -346,15 +331,30 @@ def main(args=None): # Define path to see if it exists datapath = Path('DATA') / Path(stkey) if not datapath.is_dir(): - print('\nPath to '+str(datapath)+' doesn`t exist - creating it') + print("\nPath to "+str(datapath)+ + " doesn't exist - creating it") datapath.mkdir(parents=True) # Establish client - client = Client( - base_url=args.server, - user=args.userauth[0], - password=args.userauth[1], - eida_token=args.tokenfile) + inv = None + if args.localdata is None: + client = FDSN_Client( + base_url=args.server, + user=args.userauth[0], + password=args.userauth[1], + eida_token=args.tokenfile) + # Use local client for SDS + else: + client = SDS_Client( + args.localdata, + format=args.dtype) + # Try loading the station XML to remove response + xmlfile, ext = osp.splitext(args.indb) + try: + inv = read_inventory(xmlfile+".xml") + except Exception: + print("\nStation XML file " + xmlfile + + ".xml not found -> Cannot remove response") # Get catalogue search start time if args.startT is None: @@ -372,13 +372,14 @@ def main(args=None): continue # Temporary print locations - tlocs = sta.location + tlocs = copy.copy(sta.location) if len(tlocs) == 0: tlocs = [''] for il in range(0, len(tlocs)): if len(tlocs[il]) 
== 0: - tlocs[il] = "--" - sta.location = tlocs + tlocs.append("--") + if "--" in tlocs: + sta.location = [''] # Update Display print("\n|===============================================|") @@ -416,10 +417,9 @@ def main(args=None): tstamp = str(t1.year).zfill(4)+'.'+str(t1.julday).zfill(3)+'.' print("\n"+"*"*60) - print("* Downloading day-long data for key {0} " + - " and day {1}.{2:03d}".format(stkey, t1.year, t1.julday)) + print("* Downloading day-long data for key {0} and day {1}.{2:03d}".format( + stkey, t1.year, t1.julday)) print("*") - print("* Channels selected: "+str(args.channels)+' and vertical') # Define file names (to check if files already exist) # Horizontal 1 channel @@ -431,192 +431,145 @@ def main(args=None): # Pressure channel fileP = datapath / (tstamp+'.'+sta.channel[0]+'DH.SAC') - if "P" not in args.channels: - - # If data files exist, continue - if fileZ.exists() and file1.exists() and file2.exists(): - if not args.ovr: - print( - "* "+tstamp + - "*SAC ") - print( - "* -> Files already exist, " + - "continuing ") - t1 += dt - t2 += dt - continue - - channels = sta.channel.upper()+'1,'+sta.channel.upper() + \ - '2,'+sta.channel.upper()+args.zcomp - - # Get waveforms from client - try: - print("* "+tstamp + - "*SAC ") - print("* -> Downloading Seismic data... ") - sth = client.get_waveforms( - network=sta.network, - station=sta.station, - location=sta.location[0], - channel=channels, - starttime=t1, - endtime=t2, - attach_response=True) - print("* ...done") - - except Exception: - print(" Error: Unable to download ?H? components - " + - "continuing") + # If data files exist, continue + exist = file1.exists()+file2.exists()+fileZ.exists()+fileP.exists() + if exist > 0: + if not args.ovr: + print("* "+tstamp+"*.SAC") + print("* -> Files already exist. 
Continuing") t1 += dt t2 += dt continue - st = sth.merge() - - elif "12" not in args.channels: - - # If data files exist, continue - if fileZ.exists() and fileP.exists(): - if not args.ovr: - print("* "+tstamp + - "*SAC ") - print("* -> Files already exist, " + - "continuing ") - t1 += dt - t2 += dt - continue - - channels = sta.channel.upper() + args.zcomp - - # Get waveforms from client + print("* "+tstamp +"*.SAC") + # Get waveforms from client, one channel at a time + try: + cha = sta.channel.upper() + '1' + print("* -> Downloading "+cha+" data... ") + st1 = client.get_waveforms( + network=sta.network, + station=sta.station, + location=sta.location[0], + channel=cha, + starttime=t1, + endtime=t2, + attach_response=True) try: - print("* "+tstamp + - "*SAC ") - print("* -> Downloading Seismic data... ") - sth = client.get_waveforms( - network=sta.network, - station=sta.station, - location=sta.location[0], - channel=channels, - starttime=t1, - endtime=t2, - attach_response=True) + dum = st1.select(component='1')[0] print("* ...done") - except Exception: - print(" Error: Unable to download ?H? components - " + - "continuing") + print("* Warning: Component "+cha+" not found. Continuing") t1 += dt t2 += dt continue + + except Exception: + print(" Client exception: Unable to download "+cha+" component. "+ + "Continuing") + t1 += dt + t2 += dt + continue + + try: + cha = sta.channel.upper() + '2' + print("* -> Downloading "+cha+" data... 
") + st2 = client.get_waveforms( + network=sta.network, + station=sta.station, + location=sta.location[0], + channel=cha, + starttime=t1, + endtime=t2, + attach_response=True) try: - print("* -> Downloading Pressure data...") - stp = client.get_waveforms( - network=sta.network, - station=sta.station, - location=sta.location[0], - channel='?DH', - starttime=t1, - endtime=t2, - attach_response=True) + dum = st2.select(component='2')[0] print("* ...done") - if len(stp) > 1: - print("WARNING: There are more than one ?DH trace") - print("* -> Keeping the highest sampling rate") - print( - "* -> Renaming channel to " + - sta.channel[0]+"DH") - if stp[0].stats.sampling_rate > \ - stp[1].stats.sampling_rate: - stp = Stream(traces=stp[0]) - else: - stp = Stream(traces=stp[1]) - except Exception: - print(" Error: Unable to download ?DH component - " + - "continuing") + print("* Warning: Component "+cha+" not found. Continuing") t1 += dt t2 += dt continue - st = sth.merge() + stp.merge() - - else: - - # If data files exist, continue - if (fileZ.exists() and file1.exists() and - file2.exists() and fileP.exists()): - if not args.ovr: - print("* "+tstamp + - "*SAC ") - print("* -> Files already exist, " + - "continuing ") - t1 += dt - t2 += dt - continue - - channels = sta.channel.upper()+'1,'+sta.channel.upper() + \ - '2,'+sta.channel.upper()+args.zcomp - - # Get waveforms from client + except Exception: + print(" Client exception: Unable to download "+cha+" component. "+ + "Continuing") + t1 += dt + t2 += dt + continue + + try: + cha = sta.channel.upper() + args.zcomp + print("* -> Downloading "+cha+" data... ") + stz = client.get_waveforms( + network=sta.network, + station=sta.station, + location=sta.location[0], + channel=cha, + starttime=t1, + endtime=t2, + attach_response=True) try: - print("* "+tstamp + - "*SAC ") - print("* -> Downloading Seismic data... 
") - sth = client.get_waveforms( - network=sta.network, - station=sta.station, - location=sta.location[0], - channel=channels, - starttime=t1, - endtime=t2, - attach_response=True) + dum = stz.select(component=args.zcomp)[0] print("* ...done") - except Exception: - print(" Error: Unable to download ?H? components - " + - "continuing") + print("* Warning: Component "+cha+" not found. "+ + "Continuing") t1 += dt t2 += dt continue + + except Exception: + print(" Client exception: Unable to download "+cha+ + " component. Try setting `--zcomp`. Continuing.") + t1 += dt + t2 += dt + continue + + try: + print("* -> Downloading ?DH data...") + stp = client.get_waveforms( + network=sta.network, + station=sta.station, + location=sta.location[0], + channel='?DH', + starttime=t1, + endtime=t2, + attach_response=True) + if len(stp) > 1: + print("WARNING: There are more than one ?DH trace") + print("* -> Keeping the highest sampling rate") + print("* -> Renaming channel to "+sta.channel[0]+"DH") + if stp[0].stats.sampling_rate > \ + stp[1].stats.sampling_rate: + stp = Stream(traces=stp[0]) + else: + stp = Stream(traces=stp[1]) try: - print("* -> Downloading Pressure data...") - stp = client.get_waveforms( - network=sta.network, - station=sta.station, - location=sta.location[0], - channel='?DH', - starttime=t1, - endtime=t2, - attach_response=True) + dum = stp.select(component='H')[0] print("* ...done") - if len(stp) > 1: - print("WARNING: There are more than one ?DH trace") - print("* -> Keeping the highest sampling rate") - print( - "* -> Renaming channel to " + - sta.channel[0]+"DH") - if stp[0].stats.sampling_rate > \ - stp[1].stats.sampling_rate: - stp = Stream(traces=stp[0]) - else: - stp = Stream(traces=stp[1]) - except Exception: - print(" Error: Unable to download ?DH component - " + - "continuing") + print("* Warning: Component ?DH not found. 
Continuing") t1 += dt t2 += dt continue - st = sth.merge() + stp.merge() + except Exception: + print(" Client exception: Unable to download ?DH component. "+ + "Continuing") + t1 += dt + t2 += dt + continue + + st = st1.merge() + st2.merge() + stz.merge() + stp.merge() # Detrend, filter st.detrend('demean') st.detrend('linear') st.filter( - 'lowpass', freq=0.5*args.new_sampling_rate, - corners=2, zerophase=True) + 'lowpass', + freq=0.5*args.new_sampling_rate, + corners=2, + zerophase=True) st.resample(args.new_sampling_rate) # Check streams @@ -626,43 +579,78 @@ def main(args=None): t2 += dt continue - sth = st.select(component='1') + st.select(component='2') + \ - st.select(component=args.zcomp) + # Extracting seismic data components + sth = st.select(component='1') + \ + st.select(component='2') + \ + st.select(component=args.zcomp) # Remove responses print("* -> Removing responses - Seismic data") - sth.remove_response(pre_filt=args.pre_filt, output=args.units) + try: + sth.remove_response( + inventory=inv, + pre_filt=args.pre_filt, + output=args.units) + except Exception: + print("* -> Inventory not found: Cannot remove instrument "+ + "response") # Extract traces - Z trZ = sth.select(component=args.zcomp)[0] + trZ = utils.update_stats( - trZ, sta.latitude, sta.longitude, sta.elevation, + trZ, + sta.latitude, + sta.longitude, + sta.elevation, sta.channel+'Z') trZ.write(str(fileZ), format='SAC') # Extract traces - H - if "12" in args.channels: + try: tr1 = sth.select(component='1')[0] tr2 = sth.select(component='2')[0] tr1 = utils.update_stats( - tr1, sta.latitude, sta.longitude, sta.elevation, + tr1, + sta.latitude, + sta.longitude, + sta.elevation, sta.channel+'1') tr2 = utils.update_stats( - tr2, sta.latitude, sta.longitude, sta.elevation, + tr2, + sta.latitude, + sta.longitude, + sta.elevation, sta.channel+'2') tr1.write(str(file1), format='SAC') tr2.write(str(file2), format='SAC') + except Exception: + pass # Extract traces - P - if "P" in args.channels: + 
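The rewritten download loop above fetches one channel at a time and skips the whole day as soon as any required component is missing. That control flow can be sketched with a stand-in client; all names here are illustrative stubs, while the real script uses the ObsPy FDSN or SDS clients:

```python
# FakeClient stands in for the FDSN/SDS waveform client.
class FakeClient:
    def __init__(self, available):
        self.available = available

    def get_waveforms(self, channel):
        # Raise (like a client exception) when the channel has no data
        if channel not in self.available:
            raise ValueError("no data for " + channel)
        return [channel]


def fetch_day(client, channels):
    # Try each channel in turn; abandon the whole day if any one fails,
    # mirroring the `continue` on exception in the script's daily loop.
    streams = []
    for cha in channels:
        try:
            streams.append(client.get_waveforms(channel=cha))
        except Exception:
            return None
    return streams


print(fetch_day(FakeClient({"BH1", "BH2", "BHZ", "BDH"}),
                ["BH1", "BH2", "BHZ", "BDH"]))   # all four channels found
print(fetch_day(FakeClient({"BHZ"}),
                ["BH1", "BH2", "BHZ", "BDH"]))   # None: day skipped
```

The per-channel requests make failures attributable (the script can report exactly which component is missing), at the cost of up to four round trips per station-day instead of one.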
try: + stp = st.select(component='H') + print("* -> Removing responses - Pressure data") - stp.remove_response(pre_filt=args.pre_filt) + try: + stp.remove_response( + inventory=inv, + pre_filt=args.pre_filt, + output='DEF') + except Exception: + print("* -> Inventory not found: Cannot remove "+ + "instrument response") trP = stp[0] trP = utils.update_stats( - trP, sta.latitude, sta.longitude, sta.elevation, + trP, + sta.latitude, + sta.longitude, + sta.elevation, sta.channel[0]+'DH') trP.write(str(fileP), format='SAC') + except Exception: + pass t1 += dt t2 += dt diff --git a/obstools/scripts/atacr_download_event.py b/obstools/scripts/atacr_download_event.py index 48c87fc..ca4cecc 100644 --- a/obstools/scripts/atacr_download_event.py +++ b/obstools/scripts/atacr_download_event.py @@ -25,29 +25,25 @@ # Import modules and functions import numpy as np -import os.path import pickle import stdb -from obspy.clients.fdsn import Client +import copy + +from obspy.clients.fdsn import Client as FDSN_Client +from obspy.clients.filesystem.sds import Client as SDS_Client from obspy.geodetics.base import gps2dist_azimuth as epi from obspy.geodetics import kilometer2degrees as k2d -from obspy.core import Stream, UTCDateTime +from obspy import Stream, UTCDateTime, read_inventory + from obstools.atacr import utils, EventStream -from pathlib import Path +from pathlib import Path from argparse import ArgumentParser from os.path import exists as exist -from numpy import nan +import os.path as osp def get_event_arguments(argv=None): - """ - Get Options from :class:`~optparse.OptionParser` objects. - - Calling options for the script `obs_download_event.py` that - accompany this package. 
- - """ parser = ArgumentParser( usage="%(prog)s [options] ", @@ -79,18 +75,12 @@ def get_event_arguments(argv=None): "all stations in the IU network [Default processes " + "all stations in the database]") parser.add_argument( - "-C", "--channels", - action="store", - type=str, - dest="channels", - default="", - help="Specify a comma-separated list of channels for " + - "which to perform the transfer function analysis. " + - "Possible options are '12' (for horizontal channels) or 'P' " + - "(for pressure channel). Specifying '12' allows " + - "for tilt correction. Specifying 'P' allows for compliance " + - "correction. [Default '12,P' looks for both horizontal and " + - "pressure and allows for both tilt AND compliance corrections]") + "-O", "--overwrite", + action="store_true", + dest="ovr", + default=False, + help="Force the overwriting of pre-existing data. " + + "[Default False]") parser.add_argument( "--zcomp", dest="zcomp", @@ -98,13 +88,6 @@ def get_event_arguments(argv=None): default="Z", help="Specify the Vertical Component Channel Identifier. "+ "[Default Z].") - parser.add_argument( - "-O", "--overwrite", - action="store_true", - dest="ovr", - default=False, - help="Force the overwriting of pre-existing data. " + - "[Default False]") # Server Settings ServerGroup = parser.add_argument_group( @@ -131,8 +114,8 @@ def get_event_arguments(argv=None): dest="userauth", default=None, help="Authentification Username and Password for the " + - "waveform server (--user-auth='username:authpassword') to access " + - "and download restricted data. [Default no user and password]") + "waveform server (--user-auth='username:authpassword') to access " + + "and download restricted data. 
[Default no user and password]") ServerGroup.add_argument( "--eida-token", action="store", @@ -140,11 +123,41 @@ def get_event_arguments(argv=None): dest="tokenfile", default=None, help="Token for EIDA authentication mechanism, see " + - "http://geofon.gfz-potsdam.de/waveform/archive/auth/index.php. " - "If a token is provided, argument --user-auth will be ignored. " - "This mechanism is only available on select EIDA nodes. The token can " - "be provided in form of the PGP message as a string, or the filename of " - "a local file with the PGP message in it. [Default None]") + "http://geofon.gfz-potsdam.de/waveform/archive/auth/index.php. " + "If a token is provided, argument --user-auth will be ignored. " + "This mechanism is only available on select EIDA nodes. The token can " + "be provided in form of the PGP message as a string, or the filename of " + "a local file with the PGP message in it. [Default None]") + + # Use local data directory + DataGroup = parser.add_argument_group( + title="Local Data Settings", + description="Settings associated with defining " + + "and using a local data base of pre-downloaded " + + "day-long SAC or MSEED files.") + DataGroup.add_argument( + "--local-data", + action="store", + type=str, + dest="localdata", + default=None, + help="Specify absolute path to a SeisComP Data Structure (SDS) " + + "archive containing day-long SAC or MSEED files" + + "(e.g., --local-data=/Home/username/Data/SDS). " + + "See https://www.seiscomp.de/seiscomp3/doc/applications/slarchive/SDS.html " + + "for details on the SDS format. If this option is used, it takes " + + "precedence over the --server settings.") + DataGroup.add_argument( + "--dtype", + action="store", + type=str, + dest="dtype", + default='MSEED', + help="Specify the data archive file type, either SAC " + + " or MSEED. Note the default behaviour is to search for " + + "SAC files. Local archive files must have extensions of " + + "'.SAC' or '.MSEED'. 
These are case-sensitive, so specify " +
+        "the correct case here.")

     # Constants Settings
     FreqGroup = parser.add_argument_group(
@@ -275,16 +288,6 @@ def get_event_arguments(argv=None):
     if len(args.stkeys) > 0:
         args.stkeys = args.stkeys.split(',')

-    # create channel list
-    if len(args.channels) > 0:
-        args.channels = args.channels.split(',')
-    else:
-        args.channels = ['12', 'P']
-
-    for cha in args.channels:
-        if cha not in ['12', 'P']:
-            parser.error("Error: Channel not recognized " + str(cha))
-
     # construct start time
     if len(args.startT) > 0:
         try:
@@ -322,9 +325,16 @@ def get_event_arguments(argv=None):
     else:
         args.userauth = [None, None]

+    # Check Datatype specification
+    if args.dtype.upper() not in ['MSEED', 'SAC']:
+        parser.error(
+            "Error: Local Data Archive must be of type 'SAC' " +
+            "or 'MSEED'. These must match the file extensions of " +
+            "the archived data.")
+
     if args.units not in ['DISP', 'VEL', 'ACC']:
         msg = ("Error: invalid --units argument. Choose among "
-               "'DISP', 'VEL', or 'ACC'")
+               "'DISP', 'VEL', or 'ACC'")
         parser.error(msg)

     if args.pre_filt is None:
@@ -334,7 +344,7 @@ def get_event_arguments(argv=None):
     args.pre_filt = sorted(args.pre_filt)
     if (len(args.pre_filt)) != 4:
         msg = ("Error: --pre-filt should contain 4 "
-               "comma-separated floats")
+               "comma-separated floats")
         parser.error(msg)

     return args
@@ -342,6 +352,18 @@ def get_event_arguments(argv=None):

 def main(args=None):

+    print()
+    print("###############################################################################")
+    print("# _ _ _ _ #")
+    print("# __| | _____ ___ __ | | ___ __ _ __| | _____ _____ _ __ | |_ #")
+    print("# / _` |/ _ \ \ /\ / / '_ \| |/ _ \ / _` |/ _` | / _ \ \ / / _ \ '_ \| __| #")
+    print("# | (_| | (_) \ V V /| | | | | (_) | (_| | (_| | | __/\ V / __/ | | | |_ #")
+    print("# \__,_|\___/ \_/\_/ |_| |_|_|\___/ \__,_|\__,_|___\___| \_/ \___|_| |_|\__| #")
+    print("# |_____| #")
+    print("# #")
+    
print("###############################################################################") + print() + if args is None: # Run Input Parser args = get_event_arguments() @@ -377,18 +399,33 @@ def main(args=None): # Define path to see if it exists eventpath = Path('EVENTS') / Path(stkey) if not eventpath.is_dir(): - print('\nPath to '+str(eventpath)+' doesn`t exist - creating it') + print("\nPath to "+str(eventpath)+ + " doesn't exist - creating it") eventpath.mkdir(parents=True) # Establish client - client = Client( - base_url=args.server, - user=args.userauth[0], - password=args.userauth[1], - eida_token=args.tokenfile) + inv = None + if args.localdata is None: + client = FDSN_Client( + base_url=args.server, + user=args.userauth[0], + password=args.userauth[1], + eida_token=args.tokenfile) + # Use local client for SDS + else: + client = SDS_Client( + args.localdata, + format=args.dtype) + # Try loading the station XML to remove response + xmlfile, ext = osp.splitext(args.indb) + try: + inv = read_inventory(xmlfile+".xml") + except Exception: + print("\nStation XML file " + xmlfile + + ".xml not found -> Cannot remove response") # Establish client for events - Default is 'IRIS'' - event_client = Client() + event_client = FDSN_Client() # Get catalogue search start time if args.startT is None: @@ -405,13 +442,14 @@ def main(args=None): continue # Temporary print locations - tlocs = sta.location + tlocs = copy.copy(sta.location) if len(tlocs) == 0: tlocs = [''] for il in range(0, len(tlocs)): if len(tlocs[il]) == 0: - tlocs[il] = "--" - sta.location = tlocs + tlocs.append("--") + if "--" in tlocs: + sta.location = [''] # Update Display print("\n|===============================================|") @@ -447,14 +485,15 @@ def main(args=None): # Get catalogue using deployment start and end cat = event_client.get_events( - starttime=tstart, endtime=tend, - minmagnitude=args.minmag, maxmagnitude=args.maxmag) + starttime=tstart, + endtime=tend, + minmagnitude=args.minmag, + 
maxmagnitude=args.maxmag) # Total number of events in Catalogue nevtT = len(cat) - print( - "| Found {0:5d}".format(nevtT) + - " possible events |") + print("| Found {0:5d}".format(nevtT) + + " possible events |") # Select order of processing ievs = range(0, nevtT) @@ -480,24 +519,18 @@ def main(args=None): # Display Event Info print("\n"+"*"*60) - print( - "* #({0:d}/{1:d}): {2:13s}".format( - inum+1, nevtT, time.strftime("%Y%m%d_%H%M%S"))) - print( - "* Origin Time: " + time.strftime("%Y-%m-%d %H:%M:%S")) - print( - "* Lat: {0:6.2f}; Lon: {1:7.2f}".format(lat, lon)) - print( - "* Dep: {0:6.2f}; Mag: {1:3.1f}".format(dep/1000., mag)) - print( - "* Dist: {0:7.2f} km; {1:7.2f} deg".format( - epi_dist, gac)) + print("* #({0:d}/{1:d}): {2:13s}".format( + inum+1, nevtT, time.strftime("%Y%m%d_%H%M%S"))) + print("* Origin Time: " + time.strftime("%Y-%m-%d %H:%M:%S")) + print("* Lat: {0:6.2f}; Lon: {1:7.2f}".format(lat, lon)) + print("* Dep: {0:6.2f}; Mag: {1:3.1f}".format(dep/1000., mag)) + print("* Dist: {0:7.2f} km; {1:7.2f} deg".format( + epi_dist, gac)) # If distance outside of distance range: if not (gac > args.mindist and gac < args.maxdist): - print( - "\n* -> Event outside epicentral distance " + - "range - continuing") + print("\n* -> Event outside epicentral distance " + + "range - continuing") continue t1 = time @@ -520,148 +553,127 @@ def main(args=None): # Pressure channel fileP = eventpath / (tstamp+'.'+sta.channel[0]+'DH.SAC') - print("\n* Channels selected: " + - str(args.channels)+' and vertical') - - # If data file exists, continue - if filename.exists(): + # If data files exist, continue + exist = file1.exists()+file2.exists()+fileZ.exists()+fileP.exists() + if exist > 0: if not args.ovr: - print("*") - print("* "+str(filename)) - print("* -> File already exists - continuing") + print("* "+tstamp+"*.SAC") + print("* -> Files already exist. 
Continuing") continue - if "P" not in args.channels: - - # Number of channels - ncomp = 3 - - # Comma-separated list of channels for Client - channels = sta.channel.upper() + '1,' + \ - sta.channel.upper() + '2,' + \ - sta.channel.upper() + args.zcomp - - # Get waveforms from client + print("* "+tstamp+"*.SAC") + # Get waveforms from client, one channel at a time + try: + cha = sta.channel.upper() + '1' + print("* -> Downloading "+cha+" data... ") + st1 = client.get_waveforms( + network=sta.network, + station=sta.station, + location=sta.location[0], + channel=cha, + starttime=t1, + endtime=t2, + attach_response=True) try: - print("* "+tstamp + - " ") - print("* -> Downloading Seismic data... ") - sth = client.get_waveforms( - network=sta.network, station=sta.station, - location=sta.location[0], channel=channels, - starttime=t1, endtime=t2, attach_response=True) + dum = st1.select(component='1')[0] print("* ...done") except Exception: - print( - " Error: Unable to download ?H? components - " + - "continuing") + print("* Warning: Component "+cha+" not found. Continuing") continue - st = sth - - elif "12" not in args.channels: - - # Number of channels - ncomp = 2 - - # Comma-separated list of channels for Client - channels = sta.channel.upper() + args.zcomp + except Exception: + print(" Client exception: Unable to download "+cha+" component. "+ + "Continuing") + continue - # Get waveforms from client + try: + cha = sta.channel.upper() + '2' + print("* -> Downloading "+cha+" data... ") + st2 = client.get_waveforms( + network=sta.network, + station=sta.station, + location=sta.location[0], + channel=cha, + starttime=t1, + endtime=t2, + attach_response=True) try: - print("* "+tstamp + - " ") - print("* -> Downloading Seismic data... 
") - sth = client.get_waveforms( - network=sta.network, station=sta.station, - location=sta.location[0], channel=channels, - starttime=t1, endtime=t2, attach_response=True) + dum = st2.select(component='2')[0] print("* ...done") except Exception: - print( - " Error: Unable to download ?H? components - " + - "continuing") + print("* Warning: Component "+cha+" not found. Continuing") continue + + except Exception: + print(" Client exception: Unable to download "+cha+" component. "+ + "Continuing") + continue + + try: + cha = sta.channel.upper() + args.zcomp + print("* -> Downloading "+cha+" data... ") + stz = client.get_waveforms( + network=sta.network, + station=sta.station, + location=sta.location[0], + channel=cha, + starttime=t1, + endtime=t2, + attach_response=True) try: - print("* -> Downloading Pressure data...") - stp = client.get_waveforms( - network=sta.network, station=sta.station, - location=sta.location[0], channel='?DH', - starttime=t1, endtime=t2, attach_response=True) + dum = stz.select(component=args.zcomp)[0] print("* ...done") - if len(stp) > 1: - print("WARNING: There are more than one ?DH trace") - print("* -> Keeping the highest sampling rate") - print( - "* -> Renaming channel to " + - sta.channel[0] + "DH") - if stp[0].stats.sampling_rate > \ - stp[1].stats.sampling_rate: - stp = Stream(traces=stp[0]) - else: - stp = Stream(traces=stp[1]) except Exception: - print(" Error: Unable to download ?DH component - " + - "continuing") + print("* Warning: Component "+cha+" not found. "+ + "Continuing") continue - st = sth + stp - - else: - - # Comma-separated list of channels for Client - ncomp = 4 - - # Comma-separated list of channels for Client - channels = sta.channel.upper() + '1,' + \ - sta.channel.upper() + '2,' + \ - sta.channel.upper() + args.zcomp + except Exception: + print(" Client exception: Unable to download "+cha+ + " component. Try setting `--zcomp`. 
Continuing.") + continue - # Get waveforms from client + try: + print("* -> Downloading ?DH data...") + stp = client.get_waveforms( + network=sta.network, + station=sta.station, + location=sta.location[0], + channel='?DH', + starttime=t1, + endtime=t2, + attach_response=True) + if len(stp) > 1: + print("WARNING: There are more than one ?DH trace") + print("* -> Keeping the highest sampling rate") + print("* -> Renaming channel to "+sta.channel[0]+"DH") + if stp[0].stats.sampling_rate > \ + stp[1].stats.sampling_rate: + stp = Stream(traces=stp[0]) + else: + stp = Stream(traces=stp[1]) try: - print("* "+tstamp + - " ") - print("* -> Downloading Seismic data... ") - sth = client.get_waveforms( - network=sta.network, station=sta.station, - location=sta.location[0], channel=channels, - starttime=t1, endtime=t2, attach_response=True) + dum = stp.select(component='H')[0] print("* ...done") except Exception: - print( - " Error: Unable to download ?H? components - " + - "continuing") - continue - try: - print("* -> Downloading Pressure data...") - stp = client.get_waveforms( - network=sta.network, station=sta.station, - location=sta.location[0], channel='?DH', - starttime=t1, endtime=t2, attach_response=True) - print(" ...done") - if len(stp) > 1: - print("WARNING: There are more than one ?DH trace") - print("* -> Keeping the highest sampling rate") - print( - "* -> Renaming channel to " + - sta.channel[0] + "DH") - if stp[0].stats.sampling_rate > \ - stp[1].stats.sampling_rate: - stp = Stream(traces=stp[0]) - else: - stp = Stream(traces=stp[1]) - except Exception: - print(" Error: Unable to download ?DH component - " + - "continuing") + print("* Warning: Component ?DH not found. Continuing") continue - st = sth + stp + except Exception: + print(" Client exception: Unable to download ?DH component. 
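When more than one `?DH` trace comes back, the script keeps the one with the higher sampling rate and renames the channel. A small sketch of that selection, using plain dicts in place of ObsPy `Trace` objects and generalizing the script's pairwise `stp[0]` vs `stp[1]` comparison to any number of candidates (names here are illustrative only):

```python
def keep_highest_rate(traces):
    # De-duplicate pressure traces: when several ?DH traces match,
    # keep only the one with the highest sampling rate.
    if len(traces) > 1:
        traces = [max(traces, key=lambda tr: tr["sampling_rate"])]
    return traces


stp = [
    {"id": "7D.X1..HDH", "sampling_rate": 100.0},
    {"id": "7D.X1..BDH", "sampling_rate": 40.0},
]
print(keep_highest_rate(stp))   # only the 100 Hz trace survives
```

A single-trace (or empty) list passes through unchanged, matching the script's behaviour of only intervening when `len(stp) > 1`.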
"+ + "Continuing") + continue + + st = st1.merge() + st2.merge() + stz.merge() + stp.merge() # Detrend, filter st.detrend('demean') st.detrend('linear') - st.filter('lowpass', freq=0.5*args.new_sampling_rate, - corners=2, zerophase=True) + st.filter( + 'lowpass', + freq=0.5*args.new_sampling_rate, + corners=2, + zerophase=True) st.resample(args.new_sampling_rate) # Check streams @@ -674,42 +686,77 @@ def main(args=None): # Remove responses print("* -> Removing responses - Seismic data") - sth.remove_response(pre_filt=args.pre_filt, output=args.units) + try: + sth.remove_response( + inventory=inv, + pre_filt=args.pre_filt, + output=args.units) + except Exception: + print("* -> Inventory not found: Cannot remove instrument "+ + "response") # Extract traces - Z trZ = sth.select(component=args.zcomp)[0] trZ = utils.update_stats( - trZ, sta.latitude, sta.longitude, sta.elevation, - sta.channel+'Z', evla=lat, evlo=lon) + trZ, + sta.latitude, + sta.longitude, + sta.elevation, + sta.channel+'Z', + evla=lat, + evlo=lon) trZ.write(str(fileZ), format='SAC') # Extract traces and write out in SAC format # Seismic channels - if "12" in args.channels: + try: tr1 = sth.select(component='1')[0] tr2 = sth.select(component='2')[0] tr1 = utils.update_stats( - tr1, sta.latitude, sta.longitude, sta.elevation, - sta.channel+'1', evla=lat, evlo=lon) + tr1, + sta.latitude, + sta.longitude, + sta.elevation, + sta.channel+'1', + evla=lat, + evlo=lon) tr2 = utils.update_stats( - tr2, sta.latitude, sta.longitude, sta.elevation, - sta.channel+'2', evla=lat, evlo=lon) + tr2, + sta.latitude, + sta.longitude, + sta.elevation, + sta.channel+'2', + evla=lat, + evlo=lon) tr1.write(str(file1), format='SAC') tr2.write(str(file2), format='SAC') + except Exception: + pass # Pressure channel - if "P" in args.channels: + try: stp = st.select(component='H') print("* -> Removing responses - Pressure data") - stp.remove_response(pre_filt=args.pre_filt) + try: + stp.remove_response( + inventory=inv, + 
pre_filt=args.pre_filt, + output='DEF') + except Exception: + print("* -> Inventory not found: Cannot remove "+ + "instrument response") trP = stp[0] trP = utils.update_stats( - trP, sta.latitude, sta.longitude, sta.elevation, - sta.channel[0]+'DH', evla=lat, evlo=lon) + trP, + sta.latitude, + sta.longitude, + sta.elevation, + sta.channel[0]+'DH', + evla=lat, + evlo=lon) trP.write(str(fileP), format='SAC') - - else: - stp = Stream() + except Exception: + pass # # Write out EventStream object # eventstream = EventStream(sta, sth, stp) diff --git a/obstools/scripts/atacr_transfer_functions.py b/obstools/scripts/atacr_transfer_functions.py index c84128a..a0d90b1 100644 --- a/obstools/scripts/atacr_transfer_functions.py +++ b/obstools/scripts/atacr_transfer_functions.py @@ -25,26 +25,21 @@ # Import modules and functions import numpy as np -from obspy import UTCDateTime import pickle import stdb +import copy + +from obspy import UTCDateTime + from obstools.atacr import StaNoise, Power, Cross, Rotation, TFNoise from obstools.atacr import utils, plotting -from pathlib import Path +from pathlib import Path from argparse import ArgumentParser from os.path import exists as exist -from numpy import nan def get_transfer_arguments(argv=None): - """ - Get Options from :class:`~optparse.OptionParser` objects. - - Calling options for the script `obs_transfer_functions.py` that accompanies - this package. - - """ parser = ArgumentParser( usage="%(prog)s [options] ", @@ -53,9 +48,9 @@ def get_transfer_arguments(argv=None): "components, to be used in cleaning vertical " + "component of OBS data. The noise data can be " + "those obtained from the daily spectra (i.e., " + - "from `obs_daily_spectra.py`) or those obtained " + + "from `atacr_daily_spectra`) or those obtained " + "from the averaged noise spectra (i.e., from " + - "`obs_clean_spectra.py`). Flags are available " + + "`atacr_clean_spectra`). 
Flags are available " + "to specify the source of data to use as well as " + "the time range over which to calculate the " + "transfer functions. The stations are processed " + @@ -203,6 +198,18 @@ def get_transfer_arguments(argv=None): def main(args=None): + print() + print("#######################################################################################") + print("# _ __ __ _ _ #") + print("# | |_ _ __ __ _ _ __ ___ / _| ___ _ __ / _|_ _ _ __ ___| |_(_) ___ _ __ ___ #") + print("# | __| '__/ _` | '_ \/ __| |_ / _ \ '__|| |_| | | | '_ \ / __| __| |/ _ \| '_ \/ __| #") + print("# | |_| | | (_| | | | \__ \ _| __/ | | _| |_| | | | | (__| |_| | (_) | | | \__ \ #") + print("# \__|_| \__,_|_| |_|___/_| \___|_|___|_| \__,_|_| |_|\___|\__|_|\___/|_| |_|___/ #") + print("# |_____| #") + print("# #") + print("#######################################################################################") + print() + if args is None: # Run Input Parser args = get_transfer_arguments() @@ -284,13 +291,12 @@ def main(args=None): continue # Temporary print locations - tlocs = sta.location + tlocs = copy.copy(sta.location) if len(tlocs) == 0: tlocs = [''] for il in range(0, len(tlocs)): if len(tlocs[il]) == 0: - tlocs[il] = "--" - sta.location = tlocs + tlocs.append("--") # Update Display print(" ") diff --git a/obstools/scripts/comply_calculate.py b/obstools/scripts/comply_calculate.py index 1034667..f586438 100644 --- a/obstools/scripts/comply_calculate.py +++ b/obstools/scripts/comply_calculate.py @@ -26,25 +26,21 @@ # -*- coding: utf-8 -*- # Import modules and functions import numpy as np -from obspy import UTCDateTime import pickle import stdb +import copy + +from obspy import UTCDateTime + from obstools.atacr import Rotation, plotting from obstools.comply import Comply + from pathlib import Path from argparse import ArgumentParser from os.path import exists as exist -from numpy import nan def get_comply_arguments(argv=None): - """ - Get Options from 
:class:`~optparse.OptionParser` objects. - - Calling options for the script `obs_transfer functions.py` that accompanies - this package. - - """ parser = ArgumentParser( usage="%(prog)s [options] ", @@ -52,9 +48,9 @@ def get_comply_arguments(argv=None): "to calculate compliance functions between various " + "components. The noise data can be " + "those obtained from the daily spectra (i.e., " + - "from `atacr_daily_spectra.py`) or those obtained " + + "from `atacr_daily_spectra`) or those obtained " + "from the averaged noise spectra (i.e., from " + - "`atacr_clean_spectra.py`). Flags are available " + + "`atacr_clean_spectra`). Flags are available " + "to specify the source of data to use as well as " + "the time range over which to calculate the " + "transfer functions. The stations are processed " + @@ -222,6 +218,18 @@ def get_comply_arguments(argv=None): def main(args=None): + print() + print("#################################################################################") + print("# _ _ _ _ #") + print("# ___ ___ _ __ ___ _ __ | |_ _ ___ __ _| | ___ _ _| | __ _| |_ ___ #") + print("# / __/ _ \| '_ ` _ \| '_ \| | | | | / __/ _` | |/ __| | | | |/ _` | __/ _ \ #") + print("# | (_| (_) | | | | | | |_) | | |_| | | (_| (_| | | (__| |_| | | (_| | || __/ #") + print("# \___\___/|_| |_| |_| .__/|_|\__, |___\___\__,_|_|\___|\__,_|_|\__,_|\__\___| #") + print("# |_| |___/_____| #") + print("# #") + print("#################################################################################") + print() + if args is None: # Run Input Parser args = get_comply_arguments() @@ -303,13 +311,12 @@ def main(args=None): continue # Temporary print locations - tlocs = sta.location + tlocs = copy.copy(sta.location) if len(tlocs) == 0: tlocs = [''] for il in range(0, len(tlocs)): if len(tlocs[il]) == 0: - tlocs[il] = "--" - sta.location = tlocs + tlocs.append("--") # Update Display print(" ") diff --git a/pyproject.toml b/pyproject.toml index e254af4..c926779 100644 --- 
a/pyproject.toml +++ b/pyproject.toml @@ -5,7 +5,7 @@ requires = ["meson-python>0.15.0", "numpy >= 1.25.0"] [project] name = "obstools" -version = "0.2.0" +version = "0.2.1" description = "Python tools for ocean bottom seismic instruments" authors = [ { name = "Pascal Audet", email = "pascal.audet@uottawa.ca" }
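The `tlocs = copy.copy(sta.location)` change that appears in both `atacr_transfer_functions.py` and `comply_calculate.py` above replaces a direct assignment that aliased the station database entry, so display-only edits to `tlocs` leaked back into `sta.location`. A minimal standalone sketch of the aliasing problem and the copy fix — the `Station` class here is a hypothetical stand-in for an `stdb` entry, not the real class:

```python
import copy

# Hypothetical stand-in for an stdb station entry with a mutable location list.
class Station:
    def __init__(self, location):
        self.location = location

# Aliased: "display" edits leak back into the station entry.
sta = Station(location=[''])
tlocs = sta.location
for il in range(len(tlocs)):
    if len(tlocs[il]) == 0:
        tlocs[il] = "--"
print(sta.location)  # ['--'] -- the database entry was silently changed

# Copied: the same edits stay local to the temporary list.
sta = Station(location=[''])
tlocs = copy.copy(sta.location)
for il in range(len(tlocs)):
    if len(tlocs[il]) == 0:
        tlocs[il] = "--"
print(sta.location)  # [''] -- the database entry is untouched
```

A shallow `copy.copy` suffices here because the list elements are immutable strings; only the list object itself must not be shared.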