This example reproduces Siamese FWI (Omar et al., 2024), which uses a network to map the observed and synthetic data into a latent space and compares them there. The network is trained to minimize the difference between the latent representations of the predicted and observed data.
For conventional FWI, the objective function is defined as the difference between the observed and synthetic data:

$$J(\mathbf{m}) = \frac{1}{2}\left\|\mathbf{d}_{\mathrm{syn}}(\mathbf{m}) - \mathbf{d}_{\mathrm{obs}}\right\|_2^2.$$

Siamese FWI, on the other hand, uses a network to map both data sets into a latent space, so we have:

$$\mathbf{z}_{\mathrm{syn}} = F_\theta\!\left(\mathbf{d}_{\mathrm{syn}}(\mathbf{m})\right)$$

and

$$\mathbf{z}_{\mathrm{obs}} = F_\theta\!\left(\mathbf{d}_{\mathrm{obs}}\right),$$

where $F_\theta$ denotes the Siamese network with shared parameters $\theta$. The objective function of Siamese FWI is then defined as the difference between the latent representations of the predicted and observed data:

$$J_{\mathrm{siamese}}(\mathbf{m}) = \frac{1}{2}\left\|\mathbf{z}_{\mathrm{syn}} - \mathbf{z}_{\mathrm{obs}}\right\|_2^2.$$
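The two objectives can be sketched as follows. This is a minimal illustration, not the repository's implementation: `encode` stands in for the trained Siamese network $F_\theta$ (here a fixed random linear projection, purely for demonstration), and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def l2_misfit(d_syn, d_obs):
    # Conventional FWI objective: 0.5 * ||d_syn - d_obs||^2
    return 0.5 * np.sum((d_syn - d_obs) ** 2)

# Stand-in for the trained Siamese encoder F_theta: a fixed linear map
# into a lower-dimensional latent space (illustration only).
W = rng.standard_normal((16, 64))

def encode(d):
    return W @ d

def siamese_misfit(d_syn, d_obs):
    # Siamese FWI objective: 0.5 * ||F(d_syn) - F(d_obs)||^2.
    # The SAME encoder is applied to both inputs (shared weights),
    # which is what makes the network "Siamese".
    z_syn, z_obs = encode(d_syn), encode(d_obs)
    return 0.5 * np.sum((z_syn - z_obs) ** 2)

# Toy data: synthetic record = observed record plus small noise.
d_obs = rng.standard_normal(64)
d_syn = d_obs + 0.1 * rng.standard_normal(64)
print(l2_misfit(d_syn, d_obs), siamese_misfit(d_syn, d_obs))
```

Both misfits vanish when the synthetic data matches the observed data exactly; the Siamese variant measures the mismatch after projection into the latent space instead of in the data space.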
The configuration file is `configure.py`. You can change the parameters in this file to test the example.
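As a rough illustration of the kind of parameters such a file typically holds (every name and value below is hypothetical, not copied from the repository's `configure.py` — check that file for the real names):

```python
# Hypothetical configuration sketch -- not the actual contents of configure.py.
nx, nz = 200, 100   # grid size (e.g. a downsampled Marmousi-like model)
dx = 10.0           # grid spacing (m)
dt = 1e-3           # time step (s)
nt = 2000           # number of time samples
f0 = 5.0            # dominant source frequency (Hz)
device = "cuda"     # set to "cpu" if no GPU is available
```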
First, we need to simulate the observed data; the script `forward.py` is used for that:

```shell
python forward.py
```

The scripts `fwi_classic.py` and `fwi_siamese.py` run the conventional L2-based and the Siamese-based FWI, respectively. You can either run them in an interactive window to check the intermediate results or run them in the background.
I only have a single 2080Ti and cannot test the full Marmousi model, so I use a downsampled version for testing. This test is still to be done.