A New Platform for Faster Seismic Data Processing

The December edition of First Break features a PGS article introducing a new data processing engine that can run on any system and promises orders-of-magnitude improvements in turnaround time. The engine maximizes data throughput and enables geophysical tasks to be both fully automated and run interactively.

As computing power grows exponentially and seismic data volumes continue to increase, there is a need to explore ever more innovative ways of applying geophysical algorithms. The primary focus of automation in seismic processing is to remove human intervention through creative and pragmatic machine learning. The mass accumulation, sorting, and processing of seismic data require an efficient processing system; without one, automation and reduced timelines will not be achievable.

Read the full paper 'Optimizing performance of data processing'.

The First Break paper describes a data processing system that is platform-agnostic and enables optimal management of both data and algorithms. Its engine readily accommodates the rise and implementation of both pragmatic and advanced machine-learning algorithms, and enables near-instantaneous processing of data.

An Efficient Engine for Any Seismic Data on Any System

At any stage of a seismic processing project, a number of workflows are used to apply geophysical algorithms as part of a sequence. The type and number of workflows depend on the data, what it will be used for, and the contractual obligations. More often than not, a sequence is a collection of data conditioning, editing, and refining steps requiring multiple data outputs.

To achieve a successful outcome with a conventional approach, a geophysicist needs to interact with the data by modifying each workflow. The new automated engine optimizes resource requirements while minimizing network load. As an example, a simple post-processing sequence of workflows comprising denoising, muting, stacking, filtering, and whitening can be reduced to a single chained sequence with each individual task optimized, cutting the time taken by the entire workflow by orders of magnitude, as sketched below.
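
To make the chaining idea concrete, here is a minimal sketch of such a single-pass sequence in Python. It is an illustration only, not the PGS engine: every function, parameter, and algorithm (the simple NumPy stand-ins for denoise, mute, stack, bandpass, and whiten) is an assumption chosen for brevity. The point is that the gather flows through all five tasks in memory, with no intermediate output written between steps.

```python
import numpy as np

# Illustrative stand-ins for the five tasks named above; all names,
# parameters, and algorithms here are placeholders, not PGS code.

def denoise(gather, threshold=3.0):
    # Clip amplitude outliers beyond `threshold` standard deviations.
    sigma = gather.std()
    return np.clip(gather, -threshold * sigma, threshold * sigma)

def mute(gather, first_live_sample=50):
    # Zero the shallow samples of every trace (a simple top mute).
    out = gather.copy()
    out[:, :first_live_sample] = 0.0
    return out

def stack(gather):
    # Sum all traces in the gather into one stacked trace.
    return gather.sum(axis=0, keepdims=True)

def bandpass(gather, lo=0.01, hi=0.4):
    # Crude FFT band-pass: zero normalized frequencies outside [lo, hi).
    spec = np.fft.rfft(gather, axis=-1)
    freqs = np.fft.rfftfreq(gather.shape[-1])
    spec[:, (freqs < lo) | (freqs >= hi)] = 0.0
    return np.fft.irfft(spec, n=gather.shape[-1], axis=-1)

def whiten(gather, eps=1e-8):
    # Spectral whitening: flatten each trace's amplitude spectrum.
    spec = np.fft.rfft(gather, axis=-1)
    spec /= np.abs(spec) + eps
    return np.fft.irfft(spec, n=gather.shape[-1], axis=-1)

SEQUENCE = [denoise, mute, stack, bandpass, whiten]

def run_sequence(gather, tasks=SEQUENCE):
    # The gather stays in memory for the whole chain, so no
    # intermediate volume is written to disk between tasks.
    for task in tasks:
        gather = task(gather)
    return gather

# Example: a synthetic gather of 60 traces x 1000 time samples.
result = run_sequence(np.random.randn(60, 1000))
```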

The new platform consolidates geophysical processes and seamlessly handles any data, regardless of its dimensionality or the system architecture on which it runs. Resources are optimized, and user interaction with the data is minimized. Live project performance, as well as future project planning and automation, can be optimized using the statistical information produced by the system.
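
As a rough illustration of how such per-task statistics might be collected, the sketch below extends the earlier example (reusing its hypothetical SEQUENCE list); the timing-and-throughput record is an assumed form, not the system's actual telemetry.

```python
import time

def run_sequence_with_stats(gather, tasks=SEQUENCE):
    # Record wall-clock time and output size per task while running
    # the chain; statistics like these could feed live performance
    # monitoring and future project planning.
    stats = []
    for task in tasks:
        t0 = time.perf_counter()
        gather = task(gather)
        stats.append({
            "task": task.__name__,
            "seconds": time.perf_counter() - t0,
            "mb_out": gather.nbytes / 1e6,
        })
    return gather, stats
```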

Contact a PGS expert

If you have questions related to our business, please send us an email.