[Editor's note: A version of this story appears in the July 2018 edition of Oil and Gas Investor.]

Earth’s life forms “are so amazingly primitive that they still think digital watches are a pretty neat idea.” —Douglas Adams, The Hitchhiker’s Guide to the Galaxy

A million leaves of grass sway in a computer breeze—each digitally rendered blade is complex, independent and in motion, all created in less than a thousandth of a second.

In 2014, NVIDIA, a manufacturer of graphics processing units (GPUs) for the gaming industry, created the video to flex the muscles of its graphics hardware and software. The splendorous detail adds a little more appeal for gamers with time to kill.

Just a few years later, Marathon Oil Corp. (NYSE: MRO) has found a way to do precisely the opposite with the power of such devices: save time.

The company is now augmenting traditional computer processing power—handled by a computer’s central processing unit (CPU)—with GPUs. Working in tandem with CPUs, the GPUs enable the company to model reservoirs in a fraction of the time traditional processors require.
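
Why the pairing works is easiest to see in miniature. The sketch below is not drawn from Marathon’s systems—it is a minimal illustration, under stated assumptions, of why reservoir-style grid math suits GPUs: every cell updates independently, so thousands of updates run at once. Swapping NumPy for the GPU library CuPy (assuming a CUDA-capable card) moves the identical arithmetic onto graphics hardware.

```python
# A minimal sketch, not Marathon's code: an explicit pressure-diffusion step
# on a 2-D grid. Each cell's update depends only on its neighbors, which is
# why this kind of arithmetic parallelizes so well on GPUs.
import numpy as np   # swap for "import cupy as np" to run on a GPU

def diffuse_pressure(p, alpha=0.1, steps=100):
    """Advance a pressure grid `steps` explicit finite-difference steps."""
    for _ in range(steps):
        # Discrete Laplacian: neighbor sum minus four times the cell itself,
        # computed for every cell at once as whole-array operations.
        lap = (np.roll(p, 1, 0) + np.roll(p, -1, 0) +
               np.roll(p, 1, 1) + np.roll(p, -1, 1) - 4.0 * p)
        p = p + alpha * lap
    return p

grid = np.zeros((1024, 1024))
grid[512, 512] = 1.0                 # a single pressure disturbance
print(diffuse_pressure(grid).max())  # the disturbance spreads across the grid
```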

For Marathon, like other oil and gas companies, speed is partly the reason behind its embrace of such technologies. Companies continuously aim for shorter cycle times, shaving days off drilling wells, completions and maintenance.

Reservoir simulations, and ways to visualize them, have become increasingly important to operators. Initially, creating a detailed model for fracking—simulating different numbers of stages and different lateral lengths—could take up to six or seven days to complete, said Bruce McCullough, vice president of technology and innovation and CIO at Marathon.

“Now, it’s done in about 10% of that time, and we’re still continuing to speed it up,” he said.

The union of video game tech and oil production is perhaps a footnote in the oil and gas industry’s history of innovation. Companies have always sated their drive to find hydrocarbons with new technology. In 1921, an Oklahoma oilman funded an experiment using sound-wave technology introduced during World War I to locate enemy artillery. The resulting test in the Arbuckle Mountains created the first oil and gas reflection seismograph.

Through the first half of 2018, Investor followed the technology conversations among E&Ps, service and start-up oil-tech companies. Successes by operators in the Eagle Ford, Permian Basin, Bakken and other shale plays are in many respects still at the embryonic stage. Artificial intelligence (AI), big data and predictive analytics have made progress and have improved returns for some E&Ps, but industry leaders see a long road ahead.

Big data is increasingly seen as important, with some commentators framing a binary choice for companies: digitize to survive, or don’t and fall behind.

“Emerging technology is disrupting the status quo for the U.S. energy sector,” said Regina Mayor, KPMG’s global and U.S. energy sector leader.

KPMG’s 2018 U.S. Energy Outlook Survey suggests that most energy executives have an eye on emerging technology. About 51% of those surveyed are exploring AI and intelligent automation to improve their operations. About 32% plan to use technology to improve products and services.

As exploration and production companies continue to embrace digital—a metamorphosis toward E&P&D organizations—a blizzard of technobabble dominates: Deep Learning, Edge, Cognitive, Blockchain.

They boil down to a similar idea: leveraging data to make better decisions, either by people or machines.

Oil and gas operators are also fielding pitches from a run of entrepreneurs who sense the increased appetite for digital answers. Some are looking in unusual places. Halliburton Co. (NYSE: HAL) found solutions to downhole fluid analysis through technology used in dog food processing. Marathon, among other companies, is exploring the uses of DNA sequencing in the oil field.

Breakthroughs also come from within the companies themselves. As Everett Rogers, a sociologist who studied the adoption of new technologies, wrote, “Innovations often bubbled up from the operational levels of the system, with the inventing done by users.”

Ideas have themselves taken on such importance that Marathon has segmented its research and development teams.

Marathon uses one team to focus on step-change technologies and another for bigger breakthroughs. “We have a group within the company that's within my area of responsibility that’s in place to look for breakthrough technologies that are, maybe three to five or longer years down the road? What are the technologies that maybe aren’t even oil and gas technologies that we should be leveraging?” McCullough said.

Like other industries disrupted by technology, oil and gas producers face familiar challenges: when to adopt innovations, how much to spend and when to ignore them. But E&Ps also find themselves at a crossroads in which spending on expensive technology must be balanced against impatient, results-hungry investors.

The tension for E&Ps is in attending to the needs of data collection, which can be expensive, while meeting the expectations of investors, Garrett Jackson, Devon Energy Corp.’s (NYSE: DVN) vice president of drilling, completions and well construction, said at Hart Energy’s DUG Executive conference in February.

“That’s a challenge we face right now, as we’re trying to be more capital efficient and walk that fine line,” he said.

Lower 48 shale operators have led the way in adopting many innovations, in part due to the oil price crash that began in 2015, according to a May report by Wood Mackenzie.

The pressure to lower marginal costs has caused U.S. shale operators to hunt for efficiencies and technologies. Digital solutions have saved E&P companies about 10% on drilling and completion well costs, Wood Mackenzie said.

E&Ps’ ability to sustain cost reductions is often met with skepticism. But innovation and “digitalization is one of the biggest opportunities to achieve the next wave,” Wood Mackenzie said.

However, data is currently more of a liability than an asset, with the sector’s information largely fragmented across business units, functions and companies, according to Wood Mackenzie. Business models will also need to evolve with the digital landscape, focusing more on automation while the workforce focuses on higher-value tasks.

But with new technologies come unforeseen consequences.

In a 2017 report, Robert Clarke, research director for Lower 48 upstream at Wood Mackenzie, said advanced technology has fueled growth in the Permian Basin. Based on its models, the Permian could reach production of more than 5 million barrels per day (MMbbl/d) by 2025. That growth would rely on new technologies such as low-temperature chemical diverters, nano-proppants and fiber-optic reservoir evaluations.

The risk: high-intensity, longer laterals in close proximity could affect infill wells, with future child well EURs reduced by 30% compared to the original parent wells.

Game On

“The Time Traveler was one of those men who are too clever to be believed … you always suspected some subtle reserve, some ingenuity in ambush.” —H.G. Wells, The Time Machine

Marathon’s leveraging of GPUs is a step toward creating faster stimulation models, first in the Bakken and now across other U.S. shale plays.

Upstream companies are in the experimental stage with many technologies, which Wood Mackenzie says can be evaluated, for now, only on expectations. But the areas where larger E&Ps are exploring innovations offer clues to where the technologies produce the greatest benefit.

“The good news about this hardware is it’s very fast, but more importantly, you can scale it up huge,” McCullough said. “Just by way of comparison, today for the reservoir modeling that we’re doing on one of our assets, I think we’ve scaled it to more than 200 processing nodes, and it processes those models at least 10 times faster than traditional models.”

In the Eagle Ford, Marathon faced a maturing asset and found small ways to operate more safely and efficiently using the digital oil field. That included real-time decision making through data dashboards and predictive alerts for potential paraffin buildups or battery failures.

Such enhancements have added a modest 2 or 3 bbl/d to each well—production that, multiplied across the asset’s wells, swells to several thousand barrels per day, at minimal cost.

Marathon’s Eagle Ford assets were a beachhead to introduce big data concepts.

“It’s not anything horribly new from a technology perspective,” McCullough said. “But the way we’ve applied it to the day-to-day operations of our Eagle Ford asset, and now our other assets as well, was certainly a breakthrough for us.”

Progress in the Eagle Ford led to revisiting the Bakken, where Marathon had become nearly inactive by early 2016, when the company released its only rig in the Williston Basin.

But around the same time, Marathon implemented new computer modeling techniques in the Bakken to take on the challenge of parent-child well interference.

“We’ve been able to really resurrect a legacy asset,” McCullough said. “What we’ve done is we’ve taken all our knowledge of the assets’ geology and the physics, and we’ve been applying these new technologies, these new ways to model fracks, new ways to model reservoirs. It’s yielding some pretty significant benefits.”

Marathon is now applying that to other assets “to do some more creative things. Speed is of the essence.”

The trick for all organizations is collecting the data needed to build the models and tools that predict problems.

Marathon has strongly emphasized data acquisition, data quality and data access in the past few years.

“Those aren’t really sexy terms,” he said. “I wish I could say that we used artificial intelligence to do that, but it’s really about putting the structure in place, to understand what are all the relevant, good sources of data, who needs that data, and then what do we need to do on top of that data to either clean it up and/or make sure it stays clean in our environment?”

A Shrewd Artifice

"How can it not know what it is?” —Blade Runner (1982 film)

In 1997, world chess champion Garry Kasparov walked away from IBM’s Deep Blue, glanced at his mother and shook his head, beaten in a match for the first time in his career.

After steamrolling the computer the previous year, he had agreed to another set of games. As Steven Levy writes in Wired, Kasparov was sucker punched, to some extent. He entered the match believing one unranked grandmaster was working with IBM’s team. Just before playing, he learned that four grandmasters had either been hired or consulted to upgrade Deep Blue.

What seemed like computer wizardry was simply adding more data and more instructions from better teachers. Deep Blue learned.

AI is an all-encompassing term that seems to splinter when examined too closely. Dr. Robert Pearl, writing for Forbes, described AI as “shorthand for any task a computer can perform just as well, if not better, than humans.”

More advanced AI involves more intricate software run on more powerful computers. Machine learning solves problems using neural networks modeled on the brain, processing information swiftly. “As a result, not even the programmers can be sure how their computer programs will derive solutions,” Pearl wrote.
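
A toy example makes Pearl’s point concrete. The sketch below is illustrative only, with made-up data: a small neural network recovers a hidden nonlinear rule, but its fitted weights give no human-readable account of how it decides.

```python
# Illustrative only: a small neural network learns a nonlinear rule from
# synthetic "sensor" data. Its fitted weights offer no human-readable
# explanation of how it decides: the opacity Pearl describes.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 3))                  # three synthetic inputs
y = (X[:, 0] * X[:, 1] > X[:, 2]).astype(int)   # hidden nonlinear rule

model = MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=2000,
                      random_state=0)
model.fit(X, y)
print(model.score(X, y))   # high accuracy, opaque reasoning
```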

At the DUG Executive conference, Coleman Rowland, a partner at Deloitte, simplified AI in the oil and gas industry to essentially mean three things:

  • Automated business processes;
  • Insights derived through data analytics; and
  • Engagement with people (customers and employees).

In the oil field, AI is powered by the data a company collects and the software it uses to analyze the information. The more data and the more powerful the hardware, the faster that software can generate solutions and, to some extent, predict events through pattern recognition and statistics.

Big data computing, machine learning and real-time knowledge transfer are also improving geology models and wellsite selection, Wood Mackenzie said in a 2017 report. Daily completion and flowback data are used to continually adjust exploitation programs using predictive analytics. EOG Resources Inc. (NYSE: EOG) updates its entire future drilling inventory nightly, running proprietary algorithms after updating its data set with real-time operational information, the report said.

But most innovation starts small. Companies find ways to do things slightly faster, more cheaply and more reliably, Wood Mackenzie said.

James Courtier, vice president of exploration and geosciences technology at Laredo Petroleum Inc. (NYSE: LPI), said the Permian Basin company views AI as simply another tool—one that can get to an answer faster.

“As human beings, one of our flaws is that we can’t necessarily look at more than two to three variables changing at the same time,” Courtier said at the DUG Executive conference. “Machines don’t care. They can look at all of it at the same time. That’s where the power comes from.”

The complexity of drilling wells means that any changes can jumble up multiple other variables, he said.

Laredo uses sophisticated algorithms—essentially programmed rules that dictate calculations and problem-solving operations—to find patterns humans can’t detect.
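
Courtier’s many-variables point can be sketched in a few lines. The well parameters below are invented for illustration (this is not Laredo’s model), but a tree ensemble of this kind weighs all of them simultaneously and reports which ones actually drive the target.

```python
# Hypothetical well parameters (names invented for illustration). A tree
# ensemble weighs all twelve at once and ranks which ones actually drive
# the synthetic "production" target.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
names = ["lateral_ft", "stages", "sand_lb_ft", "fluid_gal_ft", "spacing_ft",
         "landing_tvd", "azimuth", "rate_bpm", "cluster_ct", "soak_days",
         "gor", "porosity"]
X = rng.uniform(size=(400, len(names)))
# Production driven by a few interacting variables, plus noise.
y = 2.0 * X[:, 0] + X[:, 2] * X[:, 4] + 0.1 * rng.standard_normal(400)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
ranked = sorted(zip(names, model.feature_importances_), key=lambda t: -t[1])
for name, importance in ranked[:5]:
    print(f"{name:14s} {importance:.2f}")   # the top drivers surface first
```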

“We’ve come from, in my career, paper film logs to digitization to really looking at the nuances of a lot of data,” said Randy Foutch, CEO of Laredo Petroleum. “We’ve used it to determine what was the best way to land that lateral. We were very, very happy with the results of that.”

With a computer analyzing information, improvements can be made to completion designs, spacing, the precise point to land a well and the configuration between the wells, Courtier said.

The problem is that the data being used is now so large that Laredo finds itself bumping up against the limitations of its software.

“That’s big data in and of itself,” he said.

Devon Energy has also automated some field processes and is implementing machine learning—software that uses statistical information and other data to improve its performance. But like all operators, the company wants more data.

“Where we’re at today, we have a lot of data available, but it’s mainly been around the drilling side,” Jackson said at the DUG Executive conference. The company is now collecting more information on the completions side.

“We used not to collect that data when we were performing completions jobs. Now it’s pretty routine that all of that data is quantified,” Jackson said. “One of the things we’ve done is tried to set up our systems internally where we can capture and record” the information for later use, but also in a way that is well organized.

“The key of where we need to go is trying to quantify the completion before we go out there and perform it,” he said. “Being able to do that real-time is kind of the Holy Grail of where we’re trying to go. How do we get there; how do we get that data?”

McCullough cautions that AI has limits.

“I think you have to be careful how you define artificial intelligence,” he said. “I doubt we’re ever going to get to a point where we can plug all the data in, and a machine will spit out the latitude and longitude of where you should drill and how many stages the frack is, and how long your laterals are.

“There’s still too much uncertainty under the ground that requires the human to apply the learnings from other areas and their own experiences,” he said.

For organizations, the evolution is not only in how the machine learns but also in the acceptance by human operators.

Digital Insecurity

“Any sufficiently advanced technology is indistinguishable from magic.” —Arthur C. Clarke

Predicting the future is, at best, a technological parlor trick. But technology is still able to hazard good guesses—in some cases far better than humans.

In 2016, students at MIT created software in which a computer guessed, or “hallucinated,” what might occur based on a single static picture: a train, a beach, a golf course. The software learned by using large numbers of videos to predict plausible futures.

The result: from a single image, the software could create a small, one-second video of what was likely to happen.

Hess Corp. (NYSE: HES) is aiming for something similar in the Williston Basin, where the company has spent the past couple of years trying to bring people, information and materials together in the right sequence.

In Denver this April, Barry Biggs, Hess’ vice president of onshore, described efforts to watch over its operations with Exception Based Surveillance (EBS). The system looks for things out of the ordinary and relays signals to operators. When production rates drop, for instance, the computer detects the drop.
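
Hess has not published its EBS logic, but the core idea can be sketched in a few lines: compare each well’s latest reading against its own recent baseline and raise a flag on a material drop. The thresholds below are invented for illustration.

```python
# A hypothetical stand-in for an exception-based surveillance check: flag a
# well whose latest rate falls well below its own rolling baseline. Hess's
# actual EBS logic is not public; this only illustrates the idea.
import numpy as np

def rate_drop_signal(rates, window=24, drop_frac=0.20):
    """True if the latest reading is >20% below the rolling median."""
    baseline = np.median(rates[-window - 1:-1])   # the prior `window` hours
    return rates[-1] < (1.0 - drop_frac) * baseline

hourly_rates = np.array([510, 505, 498, 512, 507, 500] * 4 + [385])
print(rate_drop_signal(hourly_rates))   # True -> relay an alert to operators
```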

Hess also monitors the surveillance program itself with another layer of software.

“We have a machine-learning algorithm running on the top of all the production rate drop signals,” Biggs said at Hart Energy’s DUG Rockies conference.

Through machine-learning software, the company is able to predict whether rates are changing due to tubing leaks, which are responsible for about half of Hess’ production problems.

Hess’ alerts rely on machine learning to improve over time with diagnostics performed hourly to identify issues. The result: signals to operators “get smarter” as more data is processed.

“We’ve got that to the point where it’s just about 100%” correct, Biggs said.

Hess’ EBS has eliminated the need for daily visits to wellsites, decreased annual lifting costs by about 30% and successfully detected tubing leaks.

As with the Bakken Shale and the Eagle Ford, companies such as Marathon, Hess and Devon Energy are taking technological improvements to other basins, hoping to further replicate or surpass previous successes.

“We don’t send an operator out. We don’t send a checkout crew out. We send that well over to the rig group to prioritize and go out and fix it,” Biggs said.

Through incremental steps, the company has saved 2.2 days in otherwise wasted time and headed off deferred production of about 250 barrels of oil equivalent per day (boe/d) for each well.

Ideally, Biggs said, the company would eliminate the failure to begin with and apply its data tools to optimize wells even more.

But there is a capital trade-off. “This whole economic/optimization piece is not simple, and it takes all of those components to make it work,” he said.

Hess’ next generation of EBS will attempt to use analytics to identify problems before they occur.

Digital service companies such as Ambyint are working toward autonomous well operations. Ambyint promotes its technology as the oil and gas industry’s answer to the “self-driving car.”

Ambyint CEO Alex Robart, who spoke at Hart Energy’s DUG Permian conference in May, said the company is using Edge, Internet of Things and AI/machine learning technologies to transition to the “autonomous oil field.”

He said Ambyint is aiming for a next generation of oil and gas automation.

Ambyint counts as one of its strategic advantages a superspeed datalink able to sample information at 5-millisecond intervals and carry it on bandwidth capable of moving 100 million operating hours’ worth of data. Robart said a typical logging system operates in 5-second intervals (a thousandfold difference in sampling density), while others still take as much as 30 minutes to acquire, send and receive data.

The swift collection of data is combined with an Edge device—a computer roughly the size and shape of a hockey puck and akin to a smartphone.

“This device allows us to run real-time computations as well as real-time analytics,” Robart said. “You simply can’t do that with existing technology stack out there.”

In a case study, Ambyint said it partnered with a major Bakken operator to optimize 50 horizontal wells. The goal: enhance the performance of well rod pumps.

For humans, optimization is a time-consuming, repetitive task that requires veterans in the field and significant manpower. It’s often neglected.

Ambyint’s data collection revealed that just 11% of the wells were pumping as they should have been.

About 56% of the operator’s wells were overpumping, leading to increased failure rates, higher maintenance costs due to wear and tear and increased electricity use. Overpumping also meant well rod pumps were moving more than required. Adjusting them saved an estimated 59 workovers annually.

About a third of the operator’s wells were underpumping, squandering potential production. Corrections increased well production by roughly 135 bbl/d—boosting volumes by 6% across the pilot wells.

Through machine learning, Ambyint’s software recognized patterns and anomalies and produced recommendations to increase or decrease speed. In a one-week period, the software recommended increasing speed twice, resulting in a 20-bbl/d increase in production.
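
Those recommendations can be thought of as a learned version of a simple decision rule. The sketch below is hypothetical: the thresholds and the use of pump fillage are illustrative stand-ins, not Ambyint’s method.

```python
# Invented thresholds on pump fillage (how full the pump barrel is on each
# stroke), a simplified stand-in for the patterns the software learns.
def pump_speed_advice(fillage_pct, target=85.0, band=10.0):
    if fillage_pct < target - band:
        return "slow down"   # barrel not filling: overpumping, extra wear
    if fillage_pct > target + band:
        return "speed up"    # fluid available: underpumping leaves oil behind
    return "hold"

for fillage in (60.0, 88.0, 99.0):
    print(fillage, "->", pump_speed_advice(fillage))
```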

Robart said that applying automated optimization to the operator’s entire 450-well field would increase annual revenue by roughly $78 million, reduce workover expenses by $3 million and save $350,000 in electricity expenses.

Robart said Ambyint is now working with Deep Learning technology to recognize downhole characteristics and anomalies and to identify fluid load problems.

But he cautions that before production can be sped up and efficiency optimized, the data that even the best software and hardware rely on still needs to be collected, which requires patience.

“We spend a lot of time educating our customers on the fact that developing data science analytics is a journey,” he said. “Customers want machine learning out of the box to deliver value. But it doesn’t work that way. It takes time and hand-in-hand work with the customers to work with the data, figure out what the data is telling you, and then deploy the analytics.”

For E&P companies, learning to trust the technology also takes time.

McCullough said the first step is to get the right data to people who still have to sign off on a machine’s assessment.

“We’re doing some pretty creative things around machine learning right now to optimize production,” McCullough said. “You could argue, that’s the conditioning phase of machine learning, where machine makes suggestions, and the human says, ‘Yeah, that makes sense,’ or, ‘It doesn’t make sense,’ and then the machine continues to learn off of those interventions that the human makes.”

McCullough said that while data scientists and statisticians are important to innovation, breakthroughs have come from industry veterans.

“Where we’ve hit homeruns, from a data perspective, has been leveraging our legacy reservoir engineers and geologists who have a strong data background, or a strong information technology background,” he said.

Enter The Data Scientist

“The biggest issue with a blue screen [of death] is that it’s literally a screen with a blue background and a sad face with not enough information to tell you the problem.” —WindowsCentral.com

An NBA season consists of 82 games per team on a schedule that, in the past, has been grueling for players and physically demanding for the tall athletes who typically travel on aircraft.

The 2016 to 2017 NBA schedule revolved around 30 teams playing a combined 1,230 games. Putting a schedule together creates more possible permutations than there are atoms in the universe, according to KPMG.

The international audit, tax and advisory services firm worked with professional basketball to smooth out its schedule, using 60 computer servers, running in parallel, to generate more than 32 trillion possible schedules.
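
The permutations claim survives a back-of-envelope check: merely ordering the 1,230 games yields 1,230 factorial possibilities, dwarfing the roughly 10^80 atoms usually cited for the observable universe.

```python
# Back-of-envelope check: the number of ways just to order 1,230 games.
import math

digits = sum(math.log10(k) for k in range(2, 1231))   # log10 of 1230!
print(round(digits))   # about 3268, i.e. roughly 10^3268 orderings
```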

Schedules matter in every industry, particularly the oil and gas sector, in which safety is paramount.

Regina Mayor, KPMG’s global and U.S. energy sector leader, said analytics could apply to work crew scheduling, pipeline maintenance and even the movement of hydrocarbons through pipelines.

“You sort of take these algorithms/models that were created for one purpose and then use them to address other similar processes. Scheduling, for example, is one such activity that cuts across all types of industries and companies,” she said.

Nevertheless, every efficiency gained may stir some worry that oil and gas jobs are at stake.

“There is no doubt that it creates some anxiety. But it’s not dissimilar to the challenges that we had when jobs were moved to offshore locations or the disruption we had throughout the industry,” Mayor said. “The energy industry, in particular, has been a very technologically intensive industry for a long time.”

[Photo caption: Marathon Oil’s ability to visualize data extends to remote conferencing, enabling real-time discussions and information sharing with personnel in other offices. Displayed in Marathon’s Visionarium are wellbores and their associated landing zones.]

KPMG’s survey found that 51% of executives see technology replacing routine tasks, enabling employees to focus on strategic activities. Mayor said her clients are quick to emphasize they’re not interested in reducing headcount but in improving productivity and, ultimately, developing more acres with more production without having to scale up headcount.

The industry also finds itself still dusting off from a nasty downturn in which its labor pool was dislocated. KPMG’s survey found that 83% of executives intended to either increase or maintain their staff levels.

“The intention isn’t to further dislocate, it’s to try to make the lives of the folks who are remaining productive and fulfilling and easier,” Mayor said. “And it’s also the recognition that they’re not going to get all of those folks back.”

Studies have found that many workers have left the industry and won’t be coming back, particularly in oilfield services, she said.

KPMG is also assisting clients with next-generation enterprise resource planning (ERP) strategies, the use of big data and standardizing mineral lease payments.

“A lot of what we’re doing is just experimenting,” she said.

The newest recruit in the energy industry is the data scientist, Mayor said. KPMG’s own group, called the “Lighthouse,” is staffed with experts working to create algorithms and models that will predict behavior or outcomes accurately. “Every major oil company that I’m working with, they’re all hiring data scientists,” she said.

For the onshore oil company, the streamlining of mineral rights payments is an area in which mundane tasks are being automated.

“If you miss a lease payment, there are serious repercussions,” she said. “That could be a prime acre that you were counting on, and then suddenly you might be in a position where you’re having to lose it.”

Creating authorization for expenditure (AFE) letters is also particularly tedious. AFEs are typically form letters sent with changes in dollar amounts, the legal designation of a parcel and the amount of an expenditure. Parts of the letters are changed after people pull data from land, ERP and customer contact systems.

“We built a bot that takes care of that in seconds, not days,” Mayor said.

In another case, KPMG worked with a large energy company to improve safety. KPMG fed an algorithm the company’s safety statistics along with a buffet of other variables—number of people working, weather, land area, the different trade skills involved—then tested it against historical incidents.

“We found that we could predict with greater than 95% accuracy when an incident would take place based on the actual data,” Mayor said. “We could prevent more people getting hurt by using data to help us manage the crews better and the way we’re building these mega projects.”
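
KPMG has not published the model, but the workflow Mayor describes (feed historical records and variables to an algorithm, then test it against past incidents) follows a standard supervised-learning pattern, sketched here with synthetic data.

```python
# Synthetic illustration of the pattern: train a classifier on historical
# job records (crew size, weather, etc., all invented here) and score it
# against held-out incidents, as KPMG describes doing with real data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
X = rng.uniform(size=(2000, 4))   # e.g. crew_size, wind, area, trade_mix
risk = 6.0 * (X[:, 0] + X[:, 1] - 1.2) + rng.standard_normal(2000)
y = (risk > 0).astype(int)        # synthetic incident / no-incident labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
print(model.score(X_te, y_te))    # accuracy on incidents it never saw
```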

Stack Overflow

“You may conquer the air but the birds will lose their wonder and the clouds will smell of gasoline.” —Inherit the Wind (1955 play)

In 1943, two Iowa State College researchers decided to study why two communities had taken so long to adopt a new hybrid corn.

Although the hybrid increased crop yields by 20%, farmers took an average of seven years to fully adopt it. Overall, 13 years passed before everyone had planted the crop.

Among the reluctant farmers was Everett Rogers’ father, who delayed planting the new type of corn.

As a university professor, Rogers went on to formulate the general categories of technology adopters. The first are the innovators, often commanding little respect. They are followed by early adopters, then the majority—some entering early, some later—and finally the holdouts, or laggards.

Even disruptive technologies take a long time to implement before they are fully adopted. Many ideas also come and go, with companies unable to discern their benefits. Changing that mindset, and moving into the digital age, remains a daunting task for the industry.

“If you think about the iPhone, the iPhone came out 10 or 11 years ago, but you didn’t see the immediate adoption of smartphones,” Courtier said in February. “It took some time to gain credibility, to gain credence, to gain popularity. I think the same is true for this type of technology. It will take time.”

Innovation necessarily starts on a small scale, Wood Mackenzie noted. In the energy sector, that’s partly due to the quality of data. Many of the benefits of technology have so far been incremental, compounding into benefits over time, like interest on a loan.

For the oil and gas industry, the depth and breadth of data may eventually become the difference between winners and losers, Dave Pursell, then a managing director of investment banking at Tudor, Pickering, Holt & Co., said at the February conference.

For the past few years, the oil and gas industry has appropriated innovations by simply looking across a lease line. A competitor with more sand on a location and more frack pumps was easy to grasp.

“When I look over the lease line, I don’t see anything but a fiber optic truck and some guys in white lab coats collecting and doing DNA sequencing on the crude,” Pursell said.

“True story,” he said, adding that computing power and data scientists are not something that just anyone can replicate.

In March, Ajay Kshatriya, CEO of Biota Technology, sat in the speaker’s room at Marathon Oil Tower in Houston, where he’d addressed innovation and partnerships at a Society of Petroleum Engineers (SPE) conference.

The concept of DNA testing in the oil patch was met, many times, with shut doors from uninterested companies, he said.

After reorienting his approach, Kshatriya began underscoring the “so what factor” for E&P companies. DNA is good at knowing where things are and where they’ve been. Below the ground, oil is not a homogeneous liquid but carries genetic signatures that vary with the stratigraphic interval where it’s found.

“A big unmet need that we help address is where you’re getting your production from and how your wells are connected,” he said. “The way I describe the economic principle here is, ideally you have a six-pack of coke and you want to put one straw in every can. That would maximize your recovery.”

Kshatriya said another key to success was implementation.

“It has to be super easy, in the sense of no safety risk, no operational risk,” he said. “It’s not changing well performance. You’re saying, ‘OK, you’re drilling this well next week, put us on it.’”

As of March, Biota had conducted tests in 450 wells, including 200 in the Permian. Among Biota’s customers are Marathon Oil and Anadarko Petroleum Corp.

McCullough said the jury’s still out on DNA technology, which the company has been piloting.

“It’s caused us to look at the problem in a different way and say, ‘Are there other physical properties that are coming from the well that we can look at?’ It’s really calling into play more unconventional technologies than what we’ve typically used in the oil and gas space.”

At the SPE conference, Mark Davidson, a senior technology director at Halliburton, said the market is at multiple stages of development and adoption. The key is to recognize how far along an idea is as a solution.

“There’s certainly appetite at any stage where the value appears compelling enough for us to make investments in those different ways,” he said. “I’ll add, one other thing is, we certainly are developing our own ideas, but just as active, we are scouting for external ideas.”

In the case of Halliburton’s method for downhole fluid analysis, which used a technology found in dog food processing, the application to oil and gas was wide.

“It’s being used in our products every day,” Davidson said. “We’re using it to help our customers do downhole fluid analysis—lab quality analysis without pulling the sample out of the ground.”

McCullough said the company’s first foray into big data hasn’t really been that big. “Most companies in our peer groups have what I essentially call medium data,” he said. “It’s not Google amounts of information that they’re processing. By the same token, it still has to be extremely fast, and extremely reliable.”

What’s equally important is how often information is pulled from systems and how it’s assembled—in real time or from days or weeks in the past.

“It’s really leveraging that in-memory computing that people equate to big data, in terms of being able to have all that data at light speed,” he said. “It’s all in memory, and you can process it almost instantaneously, but then it feeds out to the different users.”

Pursell is concerned that E&P companies, despite their best efforts, are not collecting the right kind of data. He said companies drill 10,000-foot lateral wells with 50 stages, with millions of pounds of sand and millions of gallons of water.

“So what are we measuring when we flow them back? We’re measuring oil, gas and water rates and pressure at the surface,” he said. “There’s no big data there. To call that medieval is an insult to the Middle Ages.”

Five years from now, operators may look back and laugh at “how little data we’re actually collecting,” he said.

Darren Barbee can be reached at dbarbee@hartenergy.com.