
Long-life data

Oilfield Technology


Manuel Terranova, Peaxy, USA, explains how the concept of data longevity can be used to great effect in the oil and gas industry.

Most oil exploration, drilling and production companies acknowledge that data management is one of today’s greatest challenges. One might perceive the inability to get a handle on mission-critical datasets as an IT problem, but the truth is that failure to access this invaluable information prevents organisations from accomplishing their highest-level strategic objectives. If engineering teams could easily find precious datasets created decades ago and compare them to the massive amounts of vital information being generated today, oil exploration outfits would realise tremendous operational gains that would translate into vast revenue increases. For example, if seismic teams could see snapshots of the first and last year of a 20-year-old reservoir side by side, companies could save thousands of dollars through drilling efficiencies. If R&D professionals had original geometry drawings and simulations at their fingertips during the design and manufacture of a piece of equipment, they could significantly reduce its time-to-market; it has been shown that a six-month delay reduces total profits by one-third over the life of a product. Once in the field, access to these files and associated telemetry datasets could help extend the life of that asset.


The ‘Dark Data Effect’ that results from the IT refresh cycle prevents engineering teams from closing the innovation loop in the product lifecycle.

Data silos: obstacle to revenue creation
Data management approaches of the last several decades have left a trail of disparate and inaccessible data stores that are, for all practical purposes, impossible to navigate by those who need them to make decisions. Very few effective strategies exist today for unifying and comparing datasets separated by time and space, which is surprising given that the inability to access this data threatens the efficacy of resource extraction as well as equipment design, manufacture and maintenance.

Efficient hydrocarbon extraction relies on an exploration methodology that provides an accurate, accessible understanding of resource deposits, one that brings engineers and scientists as close to geologic realities as possible. This process increasingly relies on collecting and managing surveys that are hundreds of terabytes in size or larger. Managing data from current seismic and other exploration activities is further complicated by the need to compare historical observations – captured in similarly mammoth datasets years ago – with recent data.

This temporal element is in play throughout the entire product lifecycle of machines such as turbines, drills and compressors. Researchers need test bench telemetry data, simulations and geometry drawings to validate equipment’s field readiness – files that are often created within five years of each other. To monitor and upgrade these machines years later, this information must be compared to huge telemetry datasets collected in the field over the decades-long lifespan of the equipment.

Storage oriented or access oriented?
Although the purpose of storing files is to generate value from them at a later time, organisations have lost track of older datasets in large part because they disproportionately focus on optimising storage space and give less thought to the eventual need to access and reuse the data. Moreover, storage architectures that have evolved since the 1980s make it difficult to keep tabs on and retrieve these datasets 15, 30 or more years later, an issue for oil and gas scientists because these files remain useful, even mission-critical, over this extended time span.

Part of the problem is that many companies view data storage solely as an IT concern. However, the value of data content and access suggests that data management is a crucial business and engineering issue as well. IT professionals are not charged with turning seismic data into value. It is the scientists, engineers and analysts who will transform it into actionable conclusions about the state of reserves, drilling approaches, product design, maintenance and so on. The oil and gas industry needs a new paradigm in which the IT infrastructure does not prevent R&D professionals from having the data they require at their fingertips.

Shielding scientists from hardware refreshes
Currently, the needs of data users come into direct conflict with the ‘hardware refresh cycle’, the set of processes for maintaining and updating the enterprise hardware/software infrastructure. Hardware updates happen on a predictable schedule, about every three to five years, but the rolling nature of that schedule means that most Fortune 1000 data centres are upgrading 10 to 15 percent of their hardware infrastructure on an ongoing basis. Data locations (file pathnames) shift in these upgrades and links break, forcing end users (researchers, engineers and scientists) to rely on some level of ‘tribal knowledge’ to keep track of everything. In the short run, scientists and engineers face a forest of mount points; each individual may be able to navigate a few of them but will rely on colleagues to chart the rest. As employees leave the organisation and hardware refresh cycles continue, tribal knowledge degrades, ultimately leaving data stranded in locations known to no one.
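
The failure mode is easy to illustrate. Below is a minimal Python sketch using hypothetical mount points and file names (no real system is assumed): an analysis script bound to a physical mount point breaks silently once a refresh moves the underlying dataset.

    import os

    # Hypothetical mount points before and after a hardware refresh.
    # The dataset is unchanged; only the hardware hosting it is new.
    OLD_PATH = "/mnt/filer_2009/seismic/gulf_basin/survey_1994.segy"
    NEW_PATH = "/mnt/filer_2014/vol07/seismic/gulf_basin/survey_1994.segy"

    def load_survey(path):
        """Stand-in for reading a seismic survey file."""
        if not os.path.exists(path):
            raise FileNotFoundError(
                f"{path} no longer exists; after the refresh, only "
                "'tribal knowledge' can say where the data moved."
            )
        return open(path, "rb")

    # A script written before the refresh still points at the retired filer:
    try:
        survey = load_survey(OLD_PATH)
    except FileNotFoundError as err:
        print(err)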


Engineers could extend the life of turbines if they had immediate access to associated geometry, simulation and telemetry datasets.

This ‘dark data’ represents a massive cost for all organisations, but the consequences are particularly severe in the oil and gas industry. The only options for dealing with lost data are to initiate expensive expeditions to recreate or re-collect the information, or to go without the insights that would have been generated from it. Because historical data cannot be regenerated, the consequences are particularly dire when it goes dark.

Where are the datasets from seismic reflection surveys conducted in the 1980s and early 1990s, or the original drawings and non-destructive tests of a pump that has been in the field for 25 years? When companies need to compare reservoir surveys or products over decades, they often find that the biggest headaches come not from issues of research design, analytics or cost, but from structural barriers to data access. Even when dataset locations are known, these impediments make it difficult to compare datasets side by side, resulting in inefficiency and waste.

New IT paradigms for data access
In defence of the IT department, this data silo problem is in large part the result of the storage and bandwidth limitations of previous generations of computing. In the past, servers and other storage media were bulkier and costlier, and network speeds from even seven years ago pale in comparison to today’s. Now companies can buy significantly more storage space for less money, and increased network speeds enable files to be transferred over the network faster than a local drive could ever deliver them. How can oil and gas entities take advantage of this new computing landscape to create an architecture that allows them to get data from point A to point B quickly and easily?
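
Before answering, it is worth quantifying the shift with some back-of-envelope arithmetic. The figures below are assumed nominal values for illustration, not measurements:

    # Rough, illustrative arithmetic (assumed nominal figures):
    TEN_GBE_BYTES_PER_S = 10e9 / 8    # ~1.25 GB/s on a 10 GbE network link
    DISK_BYTES_PER_S = 150e6          # ~150 MB/s sequential, one spinning disk

    survey_bytes = 100e12             # a hypothetical 100 TB seismic survey

    print(f"over the network:  {survey_bytes / TEN_GBE_BYTES_PER_S / 3600:.0f} h")
    print(f"from a local disk: {survey_bytes / DISK_BYTES_PER_S / 3600:.0f} h")

Under these assumptions, moving a 100 TB survey takes roughly a day over a modern link versus the better part of a week reading it from a single local drive, which is why the network, not local storage, is now the natural path to the data.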

The approach taken by Peaxy® balances the need for storage with end user data access through ‘data abstraction’, which separates datasets from the underlying physical hardware. Current bandwidth increases allow operators to ‘spread the load’, so to speak, by placing thousands of virtual machines throughout the enterprise to help eliminate bottlenecks in retrieving and aggregating files. The practical impact of this arrangement is that pathnames to files are preserved indefinitely in Peaxy’s Hyperfiler®, so the data can be retrieved at a later point, even decades from now, using the same pathname. Scientists will be able to find datasets days, weeks or years from now as easily as they can today, and changes in hardware due to IT refresh cycles will not affect access to data. R&D teams will no longer need to rely on tribal knowledge to find these mission-critical datasets; they will be available through the Hyperfiler’s simple, highly intuitive interface.
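
As an illustration only (a generic sketch of the data-abstraction idea, not Peaxy’s actual implementation), the namespace layer can be thought of as a persistent map from permanent logical pathnames to whatever physical location currently holds the bytes:

    class Namespace:
        """Toy namespace layer: permanent logical pathnames map to
        physical locations that may change with every hardware refresh."""

        def __init__(self):
            self._locations = {}  # logical pathname -> physical location

        def register(self, logical, physical):
            self._locations[logical] = physical

        def migrate(self, logical, new_physical):
            # A refresh moves the bytes; the logical pathname never changes.
            self._locations[logical] = new_physical

        def resolve(self, logical):
            return self._locations[logical]

    ns = Namespace()
    ns.register("/seismic/gulf_basin/survey_1994.segy",
                "filer_2009:/vol3/survey_1994.segy")

    # Years later, the 2009 filer is retired and the bytes move:
    ns.migrate("/seismic/gulf_basin/survey_1994.segy",
               "filer_2014:/vol7/survey_1994.segy")

    # Users and scripts keep the same pathname before and after the refresh:
    print(ns.resolve("/seismic/gulf_basin/survey_1994.segy"))

The design point is simple indirection: as long as every reader addresses data through the namespace rather than through a mount point, hardware can be swapped out beneath it indefinitely without breaking a single pathname.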

Data longevity enables transformational change
Immediate access to mission-critical datasets over decades will impact the oil and gas industry profoundly. Consider the following possibilities in this brave new world:

  • Basin-scale seismic studies. With the ability to compare surveys taken 10 or 20 years apart, companies can understand depletion rates and ingress levels, which helps prolong a reservoir’s life. Basin-scale capability is considered a game changer in the industry.
  • Field extension projects. To extend a reservoir, researchers need to examine flow characteristics, old geometry files and original simulation models – massive unstructured datasets that are currently siloed in most companies – to validate or re-evaluate assumptions made years ago against the real-life evolution of the reservoir.
  • Efficient hydraulic fracturing. Drillers can reduce the number of drill points in fracking operations by half if they can compare past seismic studies with current, real-time ones, which would save significant time and money.
  • Next-level predictive maintenance. To prevent unplanned equipment outages, engineers need to aggregate original simulation files, test bench data and telemetry data collected in the field. This view could minimise unpredicted system failures for various kinds of deepwater equipment, such as manifold trees and downhole pumps, which sometimes need to stay in the field for 40 years, and for fracking machinery, which is sophisticated in terms of materials properties, tolerances and pressures.
  • Remote asset management. If veteran technicians onshore and less-experienced personnel on a rig are working from the same geometry, simulation and telemetry data, the additional land-based expertise can help ensure machines are performing properly in the field.

Although these use cases are operational in nature, they are highly important to the C-Suite. They represent enormous revenue and cost-saving opportunities in the short term and ensure long-run company survival. The company that provides instant retrieval and commingling of massive unstructured datasets over the span of decades will spearhead transformational change in the world of oil and gas.
