Automation and Machine Learning

Artificial Intelligence (AI) is nothing new. The true challenge for AI is to solve problems that humans solve intuitively; problems that are usually hard to describe in a formal way, where it is therefore hard to give instructions on how they should be tackled. Machine learning (ML) methods provide a wide variety of solutions that can be broadly classified as supervised, unsupervised and reinforcement ML.

Supervised ML relies upon a relevant training dataset of known question-answer outcomes (‘labeled data’), unsupervised ML initially uses no labeled training data and relies upon the existence of natural patterns in the data, and reinforcement ML pursues an iterative trial and reinforcement process to build the solution. In recent years, significant progress has been made in all flavors of ML as technology has developed and vast amounts of data for testing have become available. PGS is not alone in exploring the fields of seismic acquisition, processing, reservoir characterization and interpretation for worthwhile applications of AI and ML.

Two examples of ML solutions briefly considered below are more automated coherent noise attenuation and more automated velocity model building—two of the largest bottlenecks in seismic processing flows. Both ambitions benefit from the huge redundancy of information acquired in modern 3D seismic surveys.

Automating Velocity Model Building

Deep neural networks (DNNs) are a well-known methodology for training a computer to learn and apply a set of rules (for example, implementing some form of image recognition to interpret certain data features). However, more general reinforcement-style methods such as Monte Carlo simulations also have great potential for manipulating data to achieve a desired outcome with minimal human intervention.

For example, a simple starting velocity model can be used to automate a sophisticated tomographic update of a large survey volume in only a few days, a process that historically takes weeks to months using manual iterative updates. First, automated analysis is performed to understand the local statistical characteristics of the dataset, and then an appropriate population of random perturbations is generated and applied to the starting velocity model. All the velocity models in the population are then passed through a highly efficient tomographic inversion engine (PGS hyperTomo). After a number of cycles, a usable velocity model is generated, with the convergence criterion being flat common image gathers (CIGs).
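The perturb-evaluate-select loop described above can be sketched in a few lines. Everything below (function names, parameter values, and especially the gather-flatness proxy) is illustrative and hypothetical, not PGS hyperModel or hyperTomo code: the real convergence criterion is flat CIGs produced by tomographic inversion, whereas this toy substitutes a simple misfit against a known target model.

```python
# Illustrative Monte Carlo-style model search; names and values are hypothetical.
import numpy as np

rng = np.random.default_rng(seed=0)

def gather_flatness(model, target_model):
    # Stand-in for a residual-moveout measure on common image gathers (CIGs):
    # here simply the mean-squared misfit to a known target model.
    return float(np.mean((model - target_model) ** 2))

def monte_carlo_update(start_model, target_model, n_perturbations=50,
                       n_cycles=20, step=25.0):
    """Repeatedly perturb the current best velocity model and keep the
    candidate whose (proxy) gathers are flattest."""
    best = start_model.copy()
    best_score = gather_flatness(best, target_model)
    for _ in range(n_cycles):
        # Generate a population of random perturbations around the best model
        candidates = best + step * rng.standard_normal(
            (n_perturbations,) + best.shape)
        scores = [gather_flatness(c, target_model) for c in candidates]
        i = int(np.argmin(scores))
        if scores[i] < best_score:          # keep only improvements
            best, best_score = candidates[i], scores[i]
    return best, best_score

# Toy 1D velocity profile (m/s): start far from the target
target = np.linspace(1500.0, 4500.0, 32)
start = np.full(32, 2000.0)
model, score = monte_carlo_update(start, target)
```

In a production setting the expensive step is evaluating each candidate (a tomographic inversion per model), which is why the efficiency of the inversion engine dominates the overall runtime.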

Following proof-of-concept studies in 2019, the PGS hyperModel methodology was recently applied to a large 3D volume (3 500 sq. km) that straddles shallow and deepwater environments from a survey in West Africa. Final results are remarkably accurate, as well as insensitive to large errors in the starting model (see the figure below). PGS is also exploring ways to automate Full Waveform Inversion (FWI) that benefit from recent developments in how FWI models data reflectivity: Latest developments will be presented at the EAGE conference in June.

(A) Depth velocity model estimated using traditional multi-stage reflection tomography; (B) Final automated model using a simple starting model; and (C) Final automated model using a highly incorrect starting model. The blue arrows identify a channel feature that was accommodated in the traditional tomographic approach (orange arrow in A) using manual insertion into the model, but was not recovered using the automated methodology. The results in (B) and (C) are otherwise impressive.



Noise Removal

Noise occurs in many forms throughout a processing flow. Most initial seismic processing attempts to remove noise so the data can be migrated with optimum quality after the velocity model is derived. Nevertheless, migrated images are often contaminated by uncompensated migration swings (see the figure below). This noise (suboptimal deconstructive interference of the migration response) is a result of uneven subsurface illumination due to limited data coverage and/or propagation through complex media. Most of the time it is easy for humans to visually distinguish the artifacts in the images. We usually address this problem by designing filters that attenuate the noise; however, it is often difficult to create filters that remove the noise without damaging the image resolution. In the figure below, a DNN was trained to attenuate coherent noise on seismic images. The application to a field dataset demonstrates the potential of the method to successfully attenuate migration swings without unacceptably compromising image resolution.

The figure compares migrated seismic data containing undesirable noise artifacts (left) with the output of a machine learning workflow that attenuates much of the noise without compromising image resolution (right).
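A supervised denoiser of this kind needs (noisy, clean) training pairs. The sketch below is a hypothetical, heavily simplified illustration of how such pairs might be synthesized, with randomly dipping coherent events standing in for migration swings; all names are invented, and this is not the training scheme used by PGS or described in the referenced papers.

```python
# Illustrative training-pair synthesis for a supervised denoising network.
# The noise model is a crude stand-in for migration swings.
import numpy as np

rng = np.random.default_rng(seed=1)

def synthetic_swing_noise(nz, nx, n_events=10, amplitude=0.3):
    """Superimpose randomly dipping oscillatory events as coherent noise."""
    z = np.arange(nz)[:, None]
    x = np.arange(nx)[None, :]
    noise = np.zeros((nz, nx))
    for _ in range(n_events):
        dip = rng.uniform(-2.0, 2.0)      # moveout in samples per trace
        z0 = rng.uniform(0, nz)           # depth intercept of the event
        # Narrow oscillatory band along the dipping line z = z0 + dip * x
        dist = z - (z0 + dip * x)
        noise += amplitude * np.exp(-0.5 * (dist / 1.5) ** 2) * np.cos(dist)
    return noise

def make_training_pair(clean_image):
    """Return (network input, training label) for one example."""
    noisy = clean_image + synthetic_swing_noise(*clean_image.shape)
    return noisy, clean_image

clean = rng.standard_normal((64, 64)) * 0.1   # stand-in for a migrated image
noisy, label = make_training_pair(clean)
```

Given many such pairs, a DNN can be trained to map the noisy input back to the clean label; at inference time it is applied directly to field images, which is where the generalization challenge lies.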



What’s Next?

The implementation stage of ML solutions is typically fast and cheap, but the training stage requires large datasets. Indeed, the automated velocity model building (PGS hyperModel) example here depends upon highly efficient tools such as beam migration and its extension to reflection tomography (PGS hyperTomo): A lot of data must be interrogated and processed in the background for the computer to make robust decisions. PGS is actively exploring solutions across all our activities, from predictive maintenance of seismic survey vessels to real-time acquisition QC, processing QC, the automation of human-intensive processes, and the classification and interpretation of data deliverables. This allows us to develop fundamentally different ways of thinking from those we were taught. Applications of automation, ML and AI will augment the ways we do things: Allowing us to make better decisions using more data in less time. The path to full automation will be a journey with our clients that should be very interesting but will require a lot of careful thought and cooperation.


Further Reading

  • Bekara, M., 2019, Automatic quality control of denoise processes using support-vector machine classifier. EAGE Extended Abstracts, Tu_R06_06.
  • Klochikhina, E., Frolov, S., and Chemingui, N., 2020, Deep learning for migration artifacts attenuation. Submitted to the EAGE conference.
  • Long, A., 2019, Machine learning and related applications to seismic. Industry Insights, April.
  • Long, A., and Martin, T., 2019, Automation of marine seismic data processing. The Leading Edge, accepted for publication in early 2020.
  • Martin, T., and Bell, M., 2019, An innovative approach to automation for velocity model building. First Break, 37(6), 57-65.
  • Yang, Y., Ramos-Martinez, J., Whitmore, D., Valenciano, A.A., and Chemingui, N., 2020, Full waveform inversion using wave-equation reflectivity modeling. Submitted to the EAGE conference.