Artificial intelligence may be all the rage, but computing’s axiom of “garbage in, garbage out” still applies—AI is only as good as the source data it uses.

Investing in “right to left thinking” when planning a data program can help ensure a project’s success. Lifecycle information management (LIM) calls for thinking about what the data will be used for at the end of the project, along with the desired outcomes, Shirley Ike, global director for digital consulting at Wood Plc, said at Hart Energy’s recent DUG Appalachia Conference in Pittsburgh.

She said data has picked up the moniker of “AI currency.”

“You need information you can trust from planning to operations. Your asset produces a lot of information, and that requires a lot of sensitivity in order to manage this data in order to retain its value,” she said.

Traditionally, there has been a lot of “silo working,” with data not always accessible by those who need it.

“You don't have a single source of truth,” she said. “That means that you have a lot of contractors who are coming in, and there’s no data transfer from one stage to another, which can lead to a lot of rework and lack of visibility, and that actually delays when you are ready to start to operate.”

Employing an LIM approach requires some up-front effort to configure and collect data, she acknowledged.

“Spending a little bit more effort at the start of your project allows you to reduce the cost and time spent in data management, and you get to a place where you’re ready to operate much quicker and at a low cost,” Ike said. 

Part of the up-front work in an LIM approach includes creating class libraries in which data standards and requirements are defined. Once those standards are set, it’s time to consider how to collect information, ensure consistency and optimize data entry. A formalized procedure ensures everyone who works with the data understands the processes at work. 
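A class library of this kind can be sketched as a set of templates that fix which attributes, units and formats every instance of an equipment class must carry. The class names, attributes and rules below are illustrative stand-ins, not Wood's actual schema:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AttributeSpec:
    """One data requirement in the class library."""
    name: str
    unit: str            # required unit of measure, e.g. "barg"
    required: bool = True

# One library entry per equipment class, e.g. pressure safety valves ("PSV").
CLASS_LIBRARY = {
    "PSV": [
        AttributeSpec("tag_number", unit=""),
        AttributeSpec("set_pressure", unit="barg"),
        AttributeSpec("inlet_size", unit="mm"),
        AttributeSpec("datasheet_ref", unit="", required=False),
    ],
}

def validate(record: dict, equipment_class: str) -> list[str]:
    """Check a record against the library; return a list of problems found."""
    problems = []
    for spec in CLASS_LIBRARY[equipment_class]:
        if spec.required and record.get(spec.name) in (None, ""):
            problems.append(f"missing required attribute: {spec.name}")
    return problems
```

With standards encoded this way, every contractor handing over data can be held to the same checks automatically, rather than each party interpreting the requirements differently.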

Finally, a platform is created to collect the data: it pulls information from the various systems, then extracts, transforms and loads it automatically.

Often at the start of a project, there are gaps in available information, making it difficult to manage assets optimally, she said. There can be a lot of data scraping required, and some data exists only on paper. AI can play a major role in data cleaning and scraping.

“We sit with a lot of the stakeholders and find out where that information actually resides. There’s a lot of scraping, there is a lot of initial work that might go into that, but a lot of [clients] also just have the data. It’s somewhere, you just have to find it.”

She said a customer in the North Sea wanted to reduce offshore visits and maintenance backlog.

At the outset, they had poor quality data across different systems, which required “a lot of data cleaning” before a digital twin could be built and linked to the different systems, she said.

The LIM project made it possible for the customer to virtually walk through the asset, click on a pipe or valve and have access to "every single piece of information that pertains to … particular equipment," she said. With that information, a maintenance visit can be planned remotely, including which equipment should be taken offshore.

Ike said the clarity in data made it possible for the client to reduce offshore visits by as much as 500 per year. The result: $9 million in savings following an investment of about $1 million. Ongoing savings were expected to be about $7.5 million annually, she added. 

"Having access to data [opens up] a world of opportunities on things that you can actually achieve for your asset," Ike said. "I think of it as there's money left behind and it's buried in this data."