DIG-Kaust/Devito-fwi

A collection of codes to perform FWI with Devito

🚦 🚦 This library is under early development. Expect things to constantly change until version v1.0.0. 🚦 🚦

This library contains all the required building blocks to perform Full Waveform Inversion (FWI) with Devito (for modelling). Our vision is to provide an abstract template to perform FWI with all sorts of pre- and post-processing hooks that users can develop and seamlessly include into a stable and robust FWI engine.

By doing so, we also aim to integrate this library with PyLops and PyProximal to benefit from their modular handling of linear and proximal operators.

At its current stage of development, devito-fwi can be schematically represented as follows:

[Schematic of the devito-fwi building blocks]
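
To ground the schematic, the sketch below illustrates the generic adjoint-state FWI loop that this template abstracts. It is a minimal illustration only: forward and gradient are hypothetical placeholders standing in for the Devito-based modelling and adjoint kernels, not the actual devitofwi API.

```python
# Minimal, hypothetical sketch of the FWI loop that devitofwi abstracts.
# `forward` and `gradient` are placeholders for the Devito-based modelling
# and adjoint-state kernels; they are NOT the actual devitofwi API.
import numpy as np

def fwi(vel0, dobs, forward, gradient, niter=10, step=1e-3):
    """Plain gradient-descent FWI starting from velocity model `vel0`.

    forward(vel)        -> synthetic data for all sources
    gradient(vel, res)  -> adjoint-state gradient for a data residual
    """
    vel = vel0.copy()
    for it in range(niter):
        dsyn = forward(vel)              # forward modelling
        res = dsyn - dobs                # data residual
        loss = 0.5 * np.sum(res ** 2)    # L2 misfit
        grad = gradient(vel, res)        # adjoint-state gradient
        vel = vel - step * grad          # gradient-descent update
        print(f"iter {it}: loss = {loss:.3e}")
    return vel
```

In devitofwi, the pre- and post-processing hooks mentioned above (e.g., wavelet filtering, data masking, regularization) are meant to be slotted around the individual steps of this loop.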

Project structure

This repository is organized as follows:

  • 📂 devitofwi: python library containing routines to perform FWI with Devito modelling engines;
  • 📂 data: folder containing sample data to run notebooks;
  • 📂 notebooks: set of jupyter notebooks used to showcase different ways to run FWI with devitofwi;
  • 📂 scripts: set of Python scripts used to showcase different ways to run FWI with devitofwi in distributed mode.

Notebooks

The following notebooks are provided:

  • 📙 acoustic/Modelling_filtering.ipynb: notebook comparing modelling with a filtered wavelet versus filtering data originally computed with an unfiltered wavelet;
  • 📙 acoustic/Gradient_comparison.ipynb: notebook comparing gradients from different objective functions;
  • 📙 acoustic/Gradient_decoupled.ipynb: notebook comparing gradient computations with mixed computational graphs including a Torch Module (useful to embed NNs in FWI);
  • 📙 acoustic/Data_visualization.ipynb: notebook showcasing different visualization routines;
  • 📙 acoustic/Acoustic_RTM.ipynb: notebook performing acoustic RTM;
  • 📙 acoustic/AcousticVel_L2_1stage.ipynb: notebook performing acoustic FWI parametrized in velocity with entire data;
  • 📙 acoustic/AcousticVel_L2_1stagewrapper.ipynb: notebook performing acoustic FWI parametrized in velocity with entire data using AcousticFWI wrapper;
  • 📙 acoustic/AcousticVel_L2torch_1stage.ipynb: notebook performing acoustic FWI parametrized in velocity with entire data using Torch AD-based objective function;
  • 📙 acoustic/AcousticVel_L2adam_1stage.ipynb: notebook performing acoustic FWI parametrized in velocity with entire data, wrapping the wave engine in a Torch AD-based graph and using the Adam solver (a minimal sketch of this wrapping pattern is shown after this list);
  • 📙 acoustic/AcousticVel_L2refr_1stage.ipynb: notebook performing acoustic FWI parametrized in velocity with only refracted waves;
  • 📙 acoustic/AcousticVel_L2smooth_1stage.ipynb: notebook performing acoustic FWI parametrized in velocity with noisy data and Laplacian regularizer;
  • 📙 acoustic/AcousticVel_XCorrrefr_1stagewrapper.ipynb: notebook performing acoustic FWI parametrized in velocity with only refracted waves using XCorr objective function and AcousticFWI wrapper;
  • 📙 acoustic/AcousticVel_L2_Nstages.ipynb: notebook performing acoustic FWI parametrized in velocity with entire data in N frequency stages;
  • 📙 acoustic/AcousticVel_L2_Nstageswrapper.ipynb: notebook performing acoustic FWI parametrized in velocity with entire data in N frequency stages using AcousticFWI wrapper;
  • 📙 acoustic/AcousticVel_L2refr_Nstages.ipynb: notebook performing acoustic FWI parametrized in velocity with only refracted waves in N frequency stages.
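
For the Torch-based notebooks above, the key ingredient is a custom autograd Function whose forward pass calls the wave-equation engine and whose backward pass returns the adjoint-state gradient, so that any Torch optimizer (e.g., Adam) can drive the inversion. The sketch below illustrates the pattern under assumed names (engine.forward_model and engine.adjoint_gradient are hypothetical, not the devitofwi API):

```python
# Hypothetical sketch of wrapping a wave-equation engine in a Torch
# autograd Function so that a Torch optimizer (here Adam) drives FWI.
# `engine.forward_model` and `engine.adjoint_gradient` are assumed names,
# NOT the actual devitofwi API.
import numpy as np
import torch

class WaveMisfit(torch.autograd.Function):
    @staticmethod
    def forward(ctx, vel, engine, dobs):
        vel_np = vel.detach().cpu().numpy()
        dsyn = engine.forward_model(vel_np)              # forward modelling
        res = dsyn - dobs                                # data residual (numpy)
        ctx.engine, ctx.vel_np, ctx.res = engine, vel_np, res
        return vel.new_tensor(0.5 * np.sum(res ** 2))    # scalar L2 misfit

    @staticmethod
    def backward(ctx, grad_output):
        # Adjoint-state gradient of the misfit with respect to velocity
        grad_np = ctx.engine.adjoint_gradient(ctx.vel_np, ctx.res)
        grad = grad_output * torch.as_tensor(grad_np, dtype=grad_output.dtype)
        return grad, None, None                          # grads w.r.t. (vel, engine, dobs)

# Usage sketch: minimize the misfit over velocity with Adam
# vel = torch.tensor(vel0, requires_grad=True)
# opt = torch.optim.Adam([vel], lr=0.01)
# for _ in range(niter):
#     opt.zero_grad()
#     loss = WaveMisfit.apply(vel, engine, dobs)
#     loss.backward()
#     opt.step()
```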

Scripts

The following scripts are also provided:

  • 📙 acoustic/Gradient_comparison_mpi.py: script computing gradients distributed across sources;
  • 📙 acoustic/AcousticVel_L2_1stage.py: script performing acoustic FWI parametrized in velocity distributed across sources;
  • 📙 acoustic/AcousticVel_L2graddescent_1stage.py: script performing acoustic FWI parametrized in velocity distributed across sources with a gradient-descent algorithm;
  • 📙 acoustic/AcousticVel_L2PDTV_1stage.py: script performing acoustic FWI parametrized in velocity distributed across sources with a primal-dual algorithm and isotropic total-variation regularization.
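
These scripts parallelize the shot loop over MPI ranks. As a rough, hypothetical illustration of the pattern (the per-shot kernel shot_loss_and_gradient below is a dummy stand-in, not the devitofwi API), a misfit and gradient distributed across sources can be assembled as follows:

```python
# Hypothetical sketch of distributing the FWI misfit/gradient over sources
# with mpi4py; `shot_loss_and_gradient` is a dummy stand-in for the per-shot
# Devito modelling + adjoint computation, NOT the actual devitofwi API.
# Launch with, e.g.: mpiexec -n 4 python thisscript.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

nshots = 20

def shot_loss_and_gradient(vel, ishot):
    # Dummy per-shot misfit and gradient so the sketch runs end to end
    res = vel - 1.5
    return 0.5 * float(np.sum(res ** 2)), res

def loss_and_gradient(vel):
    loss_loc, grad_loc = 0.0, np.zeros_like(vel)
    for ishot in range(rank, nshots, size):      # round-robin shot assignment
        l, g = shot_loss_and_gradient(vel, ishot)
        loss_loc += l
        grad_loc += g
    # Sum per-rank contributions so every rank holds the full misfit/gradient
    loss = comm.allreduce(loss_loc, op=MPI.SUM)
    grad = comm.allreduce(grad_loc, op=MPI.SUM)
    return loss, grad

loss, grad = loss_and_gradient(np.full((101, 101), 2.0))
if rank == 0:
    print(f"total misfit: {loss:.3e}")
```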

Getting started 👾 🤖

Serial version

To ensure reproducibility of the results, we suggest using the environment.yml file when creating an environment.

Simply run:

./install_env.sh

This will take some time; if at the end you see the word Done! in your terminal, you are ready to go. After that, you can simply install the package:

pip install .

or in developer mode:

pip install -e .

Remember to always activate the environment by typing:

conda activate devitofwi

MPI version

Follow the same procedure described above. However, run

./install_mpienv.sh

and remember to always activate the environment by typing:

conda activate devitofwi_mpi

To do list 📝

The following list is intended to define some of the improvements that should be implemented moving forward:

  • Optimize the time step in N-stage FWI to avoid using an overly large time step when working at low frequency (the time step should not be chosen from the velocity model and the frequency of the observed data but from that of each stage)
  • Create an acousticfwi wrapper that can implement multi-stage acoustic FWI with time-space masking (and can be specialized to any alternative case)
  • Improve handling of user-defined source wavelets
  • Create a NonLinear operator template class, to be used for both propagators and norms, implementing basic operations such as sum, multiply, etc.
  • Switch velocity units to always be km/s to avoid continuously rescaling them inside the FWI codes