This repository contains all data products, metadata, and code necessary to reproduce all figures of the manuscript "Source effects in higher-order ambient seismic field correlations" by Schippkus et al., in review.
> [!NOTE]
> Some of the notebooks will not run on laptops or personal PCs due to memory limitations. We ran the analysis on a server with 64 CPU threads and 512 GB of RAM.
- `data/`: Folder to hold data and simulations. For instructions to download and generate the data, see below.
- `figures/`: All manuscript figures, generated by the Jupyter notebooks in `notebooks/`.
- `meta/`: Station metadata and matplotlib style file.
- `notebooks/`: Jupyter notebooks that implement all processing and generate the manuscript figures.
The folder `data/` is empty at the start. All field-data correlation functions must be downloaded and all simulations generated. With the same settings as used in the manuscript and notebooks, this requires ~70 GB of disk space in total. We do not provide raw field data, only the correlation functions necessary to reproduce our results.
### Field data

> [!IMPORTANT]
> Download the files and save them in the `data/` directory.
The new files are:

- `correlations_for_c1_data.pt`: $C_1$ cross-correlations of all 1990 receiver stations with the master station in the center. Saved as a `torch.tensor` with shape `[1990, 3001]`. Sampling rate 5 Hz, 300 seconds of anti-causal and causal lapse time included. The first dimension (the receiver stations) is sorted alphabetically by station name. Required for the comparison of the $C_1$ and $C_2$ wavefields.
- `correlations_for_c2_data.pt`: $C_1$ cross-correlations of all 1990 receiver stations, including the master station, with the 304 auxiliary stations surrounding them. Saved as a `torch.tensor` with shape `[1990, 305, 3001]`. Sampling rate 5 Hz, 300 seconds of anti-causal and causal lapse time included. The first dimension (the receiver stations) and second dimension (the auxiliary stations) are sorted alphabetically by station name. The basis for computing $C_2$ correlations.
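
As a quick sanity check after downloading, the two files can be loaded directly with PyTorch. This is a minimal sketch, not part of the notebooks; file names and shapes are taken from the list above, and the paths assume the repository root as the working directory.

```python
import torch

# Load the downloaded correlation tensors (paths are relative to the repository root)
c1 = torch.load("data/correlations_for_c1_data.pt")  # expected shape: [1990, 3001]
c2 = torch.load("data/correlations_for_c2_data.pt")  # expected shape: [1990, 305, 3001]
print(c1.shape, c2.shape)

# 5 Hz sampling, 300 s of anti-causal and causal lapse time -> 3001 samples, zero lag in the center
sampling_rate = 5.0
lapse_time = torch.arange(c1.shape[-1]) / sampling_rate - 300.0
```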
These correlations are computed as described in the manuscript: ~4 weeks of continuous recordings are cut into 1-hour windows and spectrally whitened. All windows are cross-correlated and linearly stacked. No additional processing is applied.
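
For readers who want the processing chain spelled out, below is a minimal, illustrative NumPy sketch of that description (1-hour windows, spectral whitening, frequency-domain cross-correlation, linear stacking). It is not the code used for the manuscript; details such as tapering, resampling to 5 Hz, and edge handling of the circular correlation are simplified.

```python
import numpy as np

def whiten(spectrum, eps=1e-10):
    # Spectral whitening: keep the phase, discard the amplitude spectrum
    return spectrum / (np.abs(spectrum) + eps)

def correlate_windows(trace_a, trace_b, fs=5.0, window_s=3600, max_lag_s=300):
    """Cut two traces into 1-hr windows, whiten, cross-correlate, linearly stack.
    Illustrative only; the manuscript notebooks implement the actual processing."""
    n_win = int(window_s * fs)                       # samples per 1-hr window
    half = int(max_lag_s * fs)                       # samples per lag side
    n_windows = min(len(trace_a), len(trace_b)) // n_win
    stack = np.zeros(2 * half + 1)
    for i in range(n_windows):
        a = trace_a[i * n_win:(i + 1) * n_win]
        b = trace_b[i * n_win:(i + 1) * n_win]
        A = whiten(np.fft.rfft(a))
        B = whiten(np.fft.rfft(b))
        cc = np.fft.irfft(A * np.conj(B), n=n_win)   # circular cross-correlation
        cc = np.fft.fftshift(cc)                     # move zero lag to the center
        mid = n_win // 2
        stack += cc[mid - half:mid + half + 1]       # keep +-300 s of lapse time
    return stack / max(n_windows, 1)                 # linear stack
```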
Run the notebook `compute_correlations.ipynb` in `notebooks/` with the parameter `synthetic = True` in the second cell to generate both sets of simulated correlation functions.
Run it three times for the different `source_mode` settings (`source_mode="both"`, `source_mode="boundary"`, `source_mode="isolated"`) to produce all sets of correlation functions used in the manuscript.
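
One way to automate the three runs is to drive the notebook with papermill. This assumes the second cell of `compute_correlations.ipynb` is tagged as a `parameters` cell, which the repository may not provide; opening the notebook and editing `synthetic` and `source_mode` by hand works just as well.

```python
import papermill as pm

# Hypothetical automation of the three synthetic runs described above.
# Assumes the parameter cell of compute_correlations.ipynb is tagged "parameters".
for mode in ["both", "boundary", "isolated"]:
    pm.execute_notebook(
        "notebooks/compute_correlations.ipynb",
        f"notebooks/compute_correlations_{mode}.ipynb",
        parameters={"synthetic": True, "source_mode": mode},
    )
```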
After downloading the data, run the notebook `compute_correlations.ipynb` in `notebooks/` with the parameter `synthetic = False` to compute the $C_2$ correlations from the downloaded field-data correlation functions.
The `pyproject.toml` file lists all packages required to run the notebooks. Install them with your preferred tool, e.g. `uv`, `pip`, or `conda`.
