Seismic Signal Processing

Special Thanks to Zhen Wang

Seismic signal processing, a subfield of digital signal processing (DSP), focuses on processing seismic data to suppress noise, enhance signals, and migrate seismic events to their correct locations in the subsurface. Typical processing steps include deconvolution, velocity analysis, normal/dip moveout correction, static correction, stacking, and migration. By producing a clearer and more accurate image of subsurface structure, seismic processing enables geologists to make better interpretations.
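The stacking step above can be illustrated with a minimal sketch, assuming NumPy is available: averaging traces that share a common reflection point suppresses uncorrelated noise, improving SNR by roughly the square root of the number of traces. The function and variable names here are illustrative, not from any particular toolbox.

```python
import numpy as np

def stack_traces(gather):
    """Stack (average) the NMO-corrected traces of a gather.

    gather: 2-D array of shape (n_traces, n_samples); the same reflection
    appears on every trace, while the noise is assumed uncorrelated.
    """
    return gather.mean(axis=0)

# Synthetic demo: one reflection wavelet on every trace, buried in noise.
rng = np.random.default_rng(0)
n_traces, n_samples = 24, 200
t = np.arange(n_samples)
wavelet = np.exp(-((t - 100) / 5.0) ** 2)   # reflection centered at sample 100
gather = wavelet + rng.normal(0.0, 1.0, (n_traces, n_samples))
stacked = stack_traces(gather)
```

With 24 traces, the residual noise in `stacked` is roughly 1/sqrt(24) of the per-trace noise, which is why stacking is such a routine SNR-boosting step.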
This research is conducted within the Center for Energy and Geo Processing (CeGP), which is co-located with CSIP and directed by a CSIP faculty member, Prof. Ghassan AlRegib.
Seismic signal processing research falls within the following major areas:

  • 1. Microseismic Data Processing:
    Applications include reservoir fluid-movement monitoring, hydraulic fracture monitoring, and the prediction and analysis of earthquakes. Sensor arrays record seismic data, which are then processed to detect and localize microseismic events buried in the noisy recordings. Typical microseismic processing techniques include signal-enhancing filters, signal detection, and array signal processing.
  • 2. Reservoir characterization and related interpretation:
    Driven by the ongoing energy crisis, finding new oil fields has become increasingly urgent. Automated seismic interpretation systems help locate and describe reservoir regions in large-scale seismic data (>100 terabytes). Such a computer-based recognition system involves 1) feature selection, e.g., physical and geometrical attributes, and 2) feature processing, which draws on data mining, machine learning, and digital image processing.
  • 3. Gathering and Compression of Seismic Data:
    Seismic data acquired in a typical field survey capture the recorded response of the subsurface, and the resulting datasets are extremely large to store and transmit. Compressive sensing, matrix theory, and optimization methods are being applied to compress these large datasets.