CEOs and leaders of oil and gas companies hear a great deal about what a post-digital company will look like. They are bombarded with buzzwords: artificial intelligence (AI), machine learning, digital twins, Big Data, digital transformation and digital journey. What is this digital journey, and how long will it take? What does it look like when companies finally arrive? Technology is nothing new to the industry. Initially, digitalization was about efficiency. Today, however, it is about much more. There is something bigger going on, and the industry is quickly approaching an inflection point.
The oil and gas industry has a massive amount of surface and subsurface data resources available. At this year’s CERAWeek by IHS Markit, one major oil and gas company’s CTO shared that in the company’s surface data alone, there are 3 billion datapoints generated by its facilities. But is it good to have more data? Not necessarily. Not if the data are in silos. Not if petroleum engineers and geoscientists spend half their time searching and assembling those data.
Although the industry is at the leading edge of data collection, and perhaps even data manipulation, the opportunity now is to move into the leading edge of model-based decision-making. A knowledge layer, an emerging and critical component of the digital stack, maps the relationships and interdependencies between the operational concepts that drive the business, such as ships, cargo types, ports, contractual agreements and more. This comprehensive view of the business frees subject matter experts (SMEs) from searching for and assembling data so they can focus on modeling and higher-order problem-solving.
It is helpful to think of knowledge as the answer to a question. A knowledge layer is deployed on top of an existing digital stack, where it integrates and enhances the human expertise, AI, machine learning and decision support tools a company has already invested in, pulling through only the data requested. It leverages existing data lake investments to reason across an entire connected business. It connects the dots among siloed data sources, allowing knowledge applications to model a business function, taking into consideration both physical assets and business processes, while integrating human insight and expertise. Each model can be reused to build more complex models, speeding decision-making and ultimately expanding the enterprise's aggregate knowledge with every decision made. Unlike individual AI solutions, a knowledge layer does not push decisions back into individual functional silos or applications; it retains the intelligence gained in a companywide knowledge repository that continues to learn and grow.
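The idea of "knowledge as the answer to a question" over connected, siloed data can be sketched as a small typed graph. This is a hypothetical illustration only: the class, the relation names and the facts below are invented for this example, not Maana's actual schema or API.

```python
from collections import defaultdict

class KnowledgeGraph:
    """Minimal sketch of a knowledge layer: facts from separate source
    systems become edges in one graph that any question can traverse."""

    def __init__(self):
        # edges maps a subject to a list of (relation, object) pairs
        self.edges = defaultdict(list)

    def add_fact(self, subject, relation, obj):
        """Record one relationship pulled through from a siloed source."""
        self.edges[subject].append((relation, obj))

    def ask(self, subject, relation):
        """Answer a question: which objects relate to subject via relation?"""
        return [o for r, o in self.edges[subject] if r == relation]

kg = KnowledgeGraph()
# Facts that would, hypothetically, come from three separate systems:
kg.add_fact("Well-42", "drilled_by", "Rig-7")        # drilling database
kg.add_fact("Well-42", "had_problem", "stuck pipe")  # daily reports
kg.add_fact("Rig-7", "operated_by", "Vendor-A")      # contracts system

print(kg.ask("Well-42", "had_problem"))
print(kg.ask("Rig-7", "operated_by"))
```

The point of the sketch is that once facts from different silos share one graph, a question spanning those silos (which vendor's rig hit problems on Well-42?) becomes a simple traversal rather than a manual data-assembly exercise.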
Well history—a sample knowledge application
A well’s life cycle can span decades, and the data about wells often are scattered across multiple databases, systems and reports. Wells acquired through partnerships and acquisitions may sit in data sources completely different from those of the wells an organization drilled itself. The longevity of many wells means the technologies used to collect these data have changed over the years, leaving decades of data trapped in silos created by legacy technology. Throughout that time span, insights about working on the well are stored in unstructured locations such as reports, drilling lookbacks and even people’s memories. How many team members through the years had their hands in the mix? Roughneck crews, engineers, data analysts, managers and others may have worked in the well construction process. These well datasets encompass millions of datapoints, a figure that grows every day as more wells are drilled and new sensors are deployed to monitor operations.
For a particular use case, Maana built the WellLine application and trained its algorithms based on public data released by Equinor for the Volve Field in June 2018. Maana also leveraged data from more than 46,000 wells in the U.S. that are publicly available from the Bureau of Ocean Energy Management to build a digital knowledge layer for wells by encoding the expertise of SMEs (such as drilling, production and petroleum engineers). By indexing and building models based on current and past data, a knowledge graph for wells was produced.
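Indexing wells to build a knowledge graph involves recognizing domain concepts inside free-text records. The sketch below shows one simple way such entity tagging could work; the vocabulary, entity types and the comment text are invented for illustration and are not drawn from the Volve or BOEM datasets or from WellLine itself.

```python
import re

# Hypothetical mini-vocabulary of well concepts an SME might encode.
VOCAB = {
    "equipment": ["BHA", "casing", "mud motor"],
    "problem": ["stuck pipe", "lost circulation", "kick"],
    "activity": ["tripping", "cementing", "drilling"],
}

def tag_entities(comment):
    """Return {entity_type: [matched terms]} found in a free-text
    drilling comment, matching case-insensitively."""
    found = {}
    for etype, terms in VOCAB.items():
        hits = [t for t in terms
                if re.search(re.escape(t), comment, re.IGNORECASE)]
        if hits:
            found[etype] = hits
    return found

comment = "Tripping out of hole; stuck pipe at 3,200 ft, worked casing free."
print(tag_entities(comment))
```

Tagged entities like these become the nodes and edges of the well knowledge graph; a production system would use far richer extraction than keyword matching, but the principle of turning unstructured comments into linked concepts is the same.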
After initial testing with the public datasets, Maana engaged a large multinational oil company with a vast amount of information on almost 10,000 wellbores. Over the course of four weeks, Maana deployed WellLine in the customer’s internal Microsoft Azure environment and brought those data into the application. The information comprised 4 million drilling comments, almost 50,000 documented lessons learned and nearly 100,000 documented well problems, representing almost 1,000 cumulative years of knowledge and experience related to drilling and working over wells in the region. This rich history was mined and presented to engineers in a way that allowed them to quickly search the history of every wellbore and understand what did and did not work in a particular field or hole section. Access to this historical knowledge allowed the client to avoid costly mistakes and make better-informed decisions with confidence.
WellLine organized and indexed the data around common well concepts such as equipment, lithologies, well sections, formations, vendors, well activities, drilling problems, HSE events, mud additives and more. WellLine reconstructed the digital histories of these 10,000 wellbores, enabling engineers and operations teams to see all of the well events on a common time line regardless of where the data were originally stored. During the course of this project, WellLine extracted and linked together more than 15 million entities and created 4.2 million unique well events from the approximately 10,000 wellbores, creating possibly the world’s largest searchable well knowledge graph to date.
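Placing well events on a common time line regardless of their source system can be sketched as a merge-and-sort over heterogeneous records. The source names, wells and events below are invented for illustration, not actual WellLine data structures.

```python
from datetime import datetime

# Hypothetical records living in two separate source systems.
daily_reports = [
    {"well": "A-1", "time": "2014-03-02T06:00", "event": "spud"},
    {"well": "A-1", "time": "2014-03-09T14:30", "event": "stuck pipe"},
]
completion_db = [
    {"well": "A-1", "time": "2014-04-01T09:00", "event": "run production casing"},
]

def unified_timeline(well_id, *sources):
    """Merge events for one well from any number of sources into a
    single time-ordered history."""
    events = [e for src in sources for e in src if e["well"] == well_id]
    return sorted(events, key=lambda e: datetime.fromisoformat(e["time"]))

for e in unified_timeline("A-1", daily_reports, completion_db):
    print(e["time"], e["event"])
```

However many systems hold pieces of a wellbore's history, the reconstructed time line reads as one record, which is the property that lets engineers review an entire well at a glance.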
Diverse applications can be created on a knowledge layer to address an array of problems. With the question/answer methodology Maana uses, scenarios can be postulated and tested. This takes the application beyond strictly capturing and organizing historical data; integration work is underway with companies specializing in predictive algorithms to add a prediction time line.
Leveraging historical data
Oil and gas companies can now access valuable data that might otherwise have remained locked in dusty files or in the mind of a retired engineer. Faster, better-informed decision-making lets companies capitalize on historical data and metrics to reduce downtime, potentially by up to 5%, and improve time to market, savings worth millions of dollars.
It is better to build a sustainable knowledge layer across the enterprise than to lock knowledge into a single application. Reaching this inflection point propels businesses forward, enabling thousands of decision models and supporting hundreds of use cases; this is where the knowledge platform emerges as its own architectural class.
The post-digital oil and gas company will have a knowledge layer in its digital stack. Its executives will have thousands of decision models running across the enterprise, augmenting its collective intelligence in real time. The resulting decision velocity will deliver efficiencies and business advantages unavailable to companies that have not adopted this new layer.