Editor's note: This is the second feature of a two-part series exploring the role of Big Data in the upstream industry. View Part 1 of the "Data that Deliver" series, featuring Schlumberger, Earth Science Analytics and Peloton here.
This article appears in the E&P newsletter.
With the world's largest oil and gas companies racing to lower operating costs and reduce emissions, Big Data continues to play a critical role in helping them stay profitable, meet increasingly stringent international environmental regulations and satisfy investors.
As the industry navigates to a low-carbon future, data experts from Baker Hughes, Nabors Drilling Solutions, Ikon Science and TrendMiner offered expert insights on how Big Data provides a lifeline to improved efficiency, better free cash flow profiles and higher ESG ratings.
Meet the experts:
- Scott Parent, CTO of Digital Solutions, Baker Hughes
- Subodh Saxena, senior vice president, Nabors Drilling Solutions
- Dr. Denis Saussus, CEO, Ikon Science
- Julian Pereira, head of customer success, TrendMiner
E&P: In the current environment of uncertainty as the industry recovers from the downturn, how can oil and gas companies benefit from Big Data?
Parent: The revolution in sensors, algorithms, analytics and cloud computing has not only greatly expanded data access, it also has moved the goal posts on what we can achieve with data. In fact, I would go as far as to say if your data isn’t feeding into software that’s advising you on how to improve your operations, you need to begin your organization’s digital transformation now.
Like so many other periods of challenge we have experienced in the energy and industrial sectors, the current pandemic is fostering important advances in digital technology that will have significant benefits for oil and gas companies. When the right data is combined with the right systems, it can seamlessly connect people, technology and processes at a grander scale. It delivers greater operational safety; it increases the productivity and efficiency of assets to reduce operating costs and protect capital investment; and it can reduce carbon emissions.
Saxena: Oil and gas companies can benefit by harvesting their vast amounts of operational data to drive repeatable and consistent performance and reduce risk.
For years, the oil and gas industry has been a huge generator of data but not typically a big consumer of that data. This is largely because field best practices and learnings through repetitive tasks were how we improved performance. Each individual rig often developed unique practices to perform safely and efficiently. The status quo, rules of thumb and decentralized decision-making prevailed over standardization through automation because drilling operations are risky, making change management challenging.
Now Big Data is offering companies the opportunity to analyze rig-level information to glean insights that can be scaled fleetwide to drive repeatable and consistent performance.
Big Data also provides oil and gas companies an opportunity to better reduce risk. Math, modeling and database approaches have proven successful in other industries, such as insurance, to manage risk.
In comparison to the traditional business intelligence (BI) approach, Big Data will streamline data collection, improve data availability and drive demand-based scaling of data crunching. With these enhancements to aggregating subsurface, surface and equipment data, along with automation, we can consistently deliver optimal performance.
Saussus: With oil and gas companies under tremendous strain due to limited access to capital and human resources, leveraging Big Data is more important than ever to ensure long-term business viability. From real-time data streams to seismic, well logs, core data and more, there’s an incredibly large and exponentially growing amount of data that companies must have the capability to manage.
Technology that can properly integrate and leverage all of that data while making it accessible and usable to end users can enable companies to make more informed predictive decisions in the field, reducing cost overruns and improving profitability.
Pereira: Oil and gas companies are indeed in turmoil. Besides recovering from the downturn, they also are facing big changes due to increasing regulation and pressure to reduce their carbon footprint and increase sustainability toward net-zero operations.
Oil and gas companies have been gathering data for years, including production process-related data, which was mainly used for controlling operations.
Today, with digitalization waves rolling and new analytics technologies proving their value, data is being used to reduce carbon footprint and emissions while helping with the energy transition.
Green energy sources are added to the mix to reduce carbon footprint, which has an impact on the production process.
With Big Data analytics in the hands of operational experts, conditions before and after energy-transition changes can be compared to ensure a positive impact on the business outcome.
E&P: Can you give a few examples of technologies that your company is deploying for acquiring or managing data and how it helped optimize processes?
Parent: A great example of our technology using data is flare.IQ, which connects oil and gas operators in real time with flare stacks across their facilities. This sensor solution provides ultrasonic metering and advanced analytics to help operators calculate the optimum levels of combustion to minimize methane emissions. It is a technology currently being used by bp as part of its plan to be a net-zero company by 2050 or sooner, with the company using flare.IQ to help it better understand, measure and ultimately reduce methane emissions from its operations.
Data can also be used across operations to ensure the latest regulatory requirements are met. A good example of how we are deploying technology is Baker Hughes' work for Petrobras, where we provide a suite of digital solutions across its sites, including thermal plants, refineries, gas treatment units, production plants, offshore platforms and FPSOs. The solutions include flare monitoring and calibration technologies, cybersecurity and remote monitoring services, and interconnected machinery protection systems and sensors.
Saxena: Nabors has been a pioneer in developing data acquisition and analysis tools for rigs, which are used to develop and deploy scalable drilling automation technologies that optimize operators' and drillers' well planning and execution through real-time data utilization.
Our SmartPLAN software digitalizes well and rig operations, enabling users to manage both the operator’s and drilling contractor’s workflows to continuously improve well manufacturing and delivery. In fact, all instructions needed to drill-to-plan are delivered to the driller’s HMI for simple quality execution from spud to spud.
Beyond the planning phase, Nabors' integrated offering of automated drilling software drives performance for customers as the bit is turning to the right—we call this our Smart Suite. Within the Smart Suite, we have SmartNAV, our automated directional guidance system, which improves wellbore placement accuracy and reduces slide hours. SmartSLIDE is our directional steering control system, which automates slide drilling to reduce cycle time and optimize performance. And SmartDRILL, a full-stand automated drilling activity sequencer, executes best practices to improve BHA reliability and reduce unplanned trips.
Time and again, this integrated suite of offerings is driving performance and reducing costs for our customers. Additionally, process automation gives our drillers more time to focus on the safety of our crews.
Creating a continuous loop that connects all phases of the well manufacturing process is our RigCLOUD platform. The technology was purpose-built as an open platform for digital operations. Users can deploy apps at the rig and on the cloud to create workflows that connect field personnel with remote users on both web and mobile devices. Leveraging the analytics suite in RigCLOUD informs continuous improvement of the well program and operation, enabling our customers to make adjustments and execute them both in real time and on their next well through SmartPLAN.
Saussus: Siloed data leads to inconsistent and incomplete interpretation, which results in lost opportunities, cost overruns and suboptimal decisions. Our knowledge management solution, Curate, integrates data from legacy databases and applications into a single workspace, giving users instant access to historic and current subsurface data through streamlined, easy-to-use workflows that enable superior data democratization and business learnings, driving faster and more accurate decision-making.
One of the challenges with full-featured single-discipline software packages, which are often used for highly specialized subsurface workflows, is that they create siloed results that are then costly to reuse in decision-making. Such applications typically carry a high price tag, don’t scale and have a significant barrier to entry in terms of skillsets and training. Curate liberates data from silos and enables all end users to access and interpret data and knowledge on demand.
Consider the cost to run a rig: an offshore rig can cost $250,000 per day, so reducing a task from half a day to 5 minutes is significant.
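The arithmetic behind that claim is simple proration of the quoted day rate (the $250,000 figure comes from the interview; everything else is straightforward):

```python
# Prorating the quoted $250,000/day offshore rig rate over a task
# that shrinks from half a day to 5 minutes.
DAY_RATE = 250_000            # USD per day, figure quoted in the interview
MINUTES_PER_DAY = 24 * 60

cost_half_day = DAY_RATE * (12 * 60) / MINUTES_PER_DAY   # 12 hours of rig time
cost_5_min = DAY_RATE * 5 / MINUTES_PER_DAY              # 5 minutes of rig time

print(round(cost_half_day))                 # 125000
print(round(cost_5_min))                    # 868
print(round(cost_half_day - cost_5_min))    # 124132
```

Roughly $125,000 of rig time saved per occurrence, before counting the knock-on value of faster decisions.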
In one example, we provided Curate to an operator looking for a solution to streamline its multiwell drilling campaign and enable its various stakeholders to collaborate on real-time monitoring decisions. Curate gave the team global access to all historical well data and project-related data, plus live streams of active drilling data along with monitoring reports and live interpretations. Curate integrates all these disparate types of data into one workspace instantly for more accurate, faster and safer decision-making.
Pereira: TrendMiner is a self-service industrial analytics solution that works directly with data residing in historians and other business applications holding operational contextual information, such as maintenance records and lab data. TrendMiner typically complements rather than detracts from the variety of production models normally in use. Not all process problems require data modeling. Time-series data holds a wealth of meaningful information, and TrendMiner exploits that value without requiring considerable expert effort to create, calibrate and maintain models.
Data-driven analysis in the process domain does not pose the same challenges, especially when, as in the case of TrendMiner, the solution has proprietary patented pattern recognition capability to identify future occurrences of past events, allowing for automated alerts. This can offer insights complementary to those of any data models present.
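TrendMiner's pattern recognition itself is proprietary, but the underlying idea of searching historical time-series data for windows that resemble a known event signature can be sketched generically. The function below is an illustrative assumption, not the product's method: it ranks historical windows against a query pattern using a z-normalized Euclidean distance, so matches are based on the shape of the signal rather than its raw values.

```python
import numpy as np

def find_similar_windows(series, pattern, top_k=3):
    """Return start indices of the historical windows in `series`
    that best match `pattern` (z-normalized Euclidean distance)."""
    m = len(pattern)
    p = (pattern - pattern.mean()) / (pattern.std() + 1e-12)
    dists = np.empty(len(series) - m + 1)
    for i in range(len(dists)):
        w = series[i:i + m]
        w = (w - w.mean()) / (w.std() + 1e-12)  # normalize shape, ignore scale
        dists[i] = np.linalg.norm(w - p)
    return np.argsort(dists)[:top_k]
```

Recognizing the shape of a past upset condition as it begins to recur is what makes the automated alerting described above possible.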
E&P: What are the key challenges of implementing Big Data in the upstream industry?
Parent: Safe and efficient operations have always required capturing and keeping track of information on things like vibration, temperature and pressure. Today the revolution in sensors, algorithms, analytics and cloud computing has not only greatly expanded data access, it also has moved the goal posts on what we can achieve with data. However, we see five key challenges currently slowing down digital deployments.
First, data can stall AI deployments if it is not available in high volume, is not accessible, is siloed in various systems or is of poor quality.
Second, digital projects often take time to develop and deploy, and interest and urgency can be lost to the grind of trying to make it all work.
Third, we are an industry stuck in proof-of-concept purgatory and need to move more quickly to scale projects across an enterprise—to cover multiple fields of wells or an entire region of refineries.
Fourth, security: only 30% of oil and gas companies have adopted cloud infrastructure today, reflecting a limited understanding of the benefits of cybersecure infrastructure and data platforms.
Fifth, knowledge: our industry has a culture of defaulting to historical knowledge to intervene ahead of potential machine failures or maintenance needs.
Saxena: The upstream industry faces functional, behavioral, recruitment and business model challenges in successfully implementing Big Data.
Acquiring data does not guarantee its completeness or quality. Each player (rig contractor, service providers, operator) is independently aggregating data and developing applications with its own subset of the data, which is very inefficient. Even simple tasks such as time stamping and synchronization can escalate into huge challenges. In addition, the lack of interoperability standards requires custom solutions, making integration difficult. Finally, cybersecurity needs are continuously evolving and must be addressed. These functional challenges require us to have a balanced data strategy across the whole upstream value chain.
From a behavioral standpoint, the industry must change the way it’s been approaching data. Decentralized decision-making must be replaced by behaviors that drive performance at scale that incorporate local knowledge.
Building a workforce of the future with the talents and skillsets that match our evolving, increasingly digital industry is also critical. Recruiting, retaining and progressing the next wave of talent is becoming an imperative to win longer term.
Finally, the business model related to ownership, costs and value creation is an evolving exercise. Partnerships and collaboration are increasingly important to achieve success as the industry adopts Big Data.
Saussus: Big Data is actually quite well understood as a concept, but not necessarily completely understood in the upstream industry.
First, we have to take a step back and ask why there is a distinction between conventional data and Big Data, and why we have different toolsets to manage and analyze these different scopes of data.
There seems to be a misplaced notion that Big Data and its associated tooling are somehow better and more advanced, resulting in a better understanding and more accurate predictions than those obtained with conventional tools. However, if that were the case, we would have seen the complete replacement of conventional tools by Big Data tools long ago, and we have not.
Conventional data technologies and Big Data technologies both have their place within our industry, and it's never one size fits all. Therefore, the real challenge is understanding the data and the four Vs (volume, variety, veracity and velocity) and making the correct technology choices at the onset, be it a Big Data solution or a more conventional technology path.
Our knowledge management system, Curate, has the flexibility to not only store most subsurface data types but also connect to existing sources, from traditional databases and applications to unstructured systems and data lakes, allowing a single interface for the end users while making the correct architectural decisions depending on the data.
Pereira: The first challenge is the perception that the "right" data is not available and that large IT projects, such as creating a data lake or finding the right cloud solution, are needed before data can be used for data-driven decision-making. Almost always, oil and gas companies have a wealth of data that is underutilized because the wrong tools, such as Excel spreadsheets or complex multidisciplinary data modeling projects, are used to leverage it.
It's about having the mindset to really make use of the available data: taking the time to leverage that data differently than in the past and using new technologies that make subject matter experts much more efficient at analyzing it. In that way, many more operational improvements can be made by more people, accelerating the energy transition.
E&P: Moving forward, what are some key opportunities for advancing data analytics in the upstream sector?
Parent: We see some key opportunities around augmented reality [AR], virtual reality [VR] and robotics to further enhance collaboration, minimize risks, improve safety and reduce costs.
The most significant areas to be impacted include:
- Remote design review: Geographically dispersed teams can use AR for collaborative design review and inspection.
- On-demand training: Current VR-based training applications can be converted to AR training applications to allow employees to upskill remotely.
- Immersive data visualization: AR will allow teams to remotely view and interact with spatial data, such as field inspection results or subsurface 3D reservoir models.
- Manual-process automation: Using robotics, computer vision and machine learning, Avitas, a Baker Hughes venture, can automate and speed up time-consuming manual processes such as well pad inspections, for example by using LUMEN drone-based spot monitoring and ground-based continuous monitoring for leaks, corrosion and more.
- Additive manufacturing and 3D printing: The additive manufacturing generative design program can suggest a better design based on the AI information provided from the field. The improved design can then be sent to a local 3D printer closer to the point of consumption—as opposed to shipping mass amounts of parts across the world—saving customers time and money.
Saxena: Many cloud-based data aggregation and analytics solutions are emerging. However, simple estimates show that less than 5% of the data acquired on the rig is brought into the cloud. Moving some of the analytics to the edge will be required to fully use all available data.
Drilling contractors are in the best position to offer an edge integration solution, and Nabors is offering such a solution. As drilling automation becomes more widely adopted, optimizing the automated processes is the next challenge.
Platforms like RigCLOUD that aggregate data and provide analytics to optimize processes both at the edge and in the cloud are required. Additionally, more advanced data-based solutions for non-drilling activities (logistics, planning, invoicing) are also use cases for the upstream sector. Creating a data exchange or data marketplace would provide discoverability of data and ease of data consumption, accelerating the growth of analytics.
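The gap between data acquired on the rig and data reaching the cloud is typically closed by summarizing at the edge rather than streaming raw samples. A minimal sketch, with function and field names assumed for illustration rather than taken from any vendor's API: high-frequency sensor readings are collapsed into one summary record per second before anything leaves the rig.

```python
import numpy as np

def summarize_at_edge(samples, rate_hz=100):
    """Collapse a raw high-frequency sensor stream into one summary
    record per second (mean/min/max/std), so only the aggregates
    need to be uploaded to the cloud."""
    n_seconds = len(samples) // rate_hz
    chunks = np.asarray(samples[:n_seconds * rate_hz], dtype=float)
    chunks = chunks.reshape(n_seconds, rate_hz)
    return [
        {"mean": float(c.mean()), "min": float(c.min()),
         "max": float(c.max()), "std": float(c.std())}
        for c in chunks
    ]
```

A minute of 100 Hz data (6,000 raw values) becomes 60 four-field records, a 25x reduction in what is transmitted, while the full-rate stream remains available at the edge for local anomaly detection.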
Holistically, our industry needs to make progress on data standards. To do this, we need to examine data standards and practices in isolation, then communicate the observed inefficiencies. This will enable the industry as a whole to make better use of our data, which will improve safety and cost management to ensure we stay competitive as an industry. Standardizing our data model will empower the sector and industry to continuously innovate to deliver value to all stakeholders.
Saussus: The key opportunity with advanced data analytics is in substantially improved geological interpretations of the subsurface, smarter well placement and more optimized field performance.
Onshore, there is a tremendous amount of real-time operational, geological and performance data that could be incorporated in models to improve well spacing, completions and other aspects essential to profitable, efficient, sustainable and safe drilling.
Offshore, as sensors become more affordable and common, constant data streams could be used to extend production, increase return on investment and monitor defects, mitigating negative environmental and safety consequences. There is also a wealth of unstructured data that has been accumulating for decades.
Much of this data will remain largely underutilized in decision-making until data analytics are adopted to be able to handle not just the large volume but its dynamic nature. Data analytics will give operators the ability to move away from static toward dynamic models of the subsurface, allowing faster decisions, increased efficiency and higher profitability.
Having said that, a critical requirement that has to come first is making that data open, standards-based and application agnostic so it is accessible in the first place, allowing the full power of data analytics to be realized.
Pereira: Data and data analytics are well-known topics in the oil and gas industry, but the way data analytics are being done and by whom is changing. We are talking about democratization of analytics. Some companies still need to start their digitalization journey, while others are already well on their way.
New opportunities for advancing data analytics in the upstream sector lie in using all contextual data while analyzing operational performance. A lot of data, and even information, resides in a variety of business applications or data silos. By combining and analyzing this contextual data, like maintenance management records, laboratory information, OEE data, etc., a deeper understanding of operational issues will help improve and even predict performance. The next step is prescriptive analytics, based on customized anomaly detection models created by the engineers who understand the process best. We call that democratization of machine learning.
Most important, though, is for companies to start today with the new self-service analytics tools to create a workforce of the future that can make daily data-driven decisions and help achieve a more predictable business outcome.