The growing volume of oil and gas production in the U.S. continues to spur interest in mergers and acquisitions. Everyone, it seems, wants in, and buying a company or merging with another are two common options. But behind the scenes, these acquisitions pose unique challenges for the IT staff and consultants who are often responsible for integrating data from a wide range of sources and differing technologies, in many cases when one system is more dated than the other.

Merging systems and process data from two organizations that combine to become one poses an enormous challenge to the new management, says Glenn Vlass, vice president at CartoPac Inc.

Technology continues to improve, allowing users to collect, store and access data in a meaningful and convenient way. “We’re always looking for ways to improve the integration process,” he says.

The task of integrating two or more technologies has become a field in its own right within information technology. “What we see right now is that there are many components of technology that are separated, and there is a tremendous opportunity to bring that technology together in a way that makes it usable across the organization,” he says.

In particular, geographic information systems (GIS) can complicate the integration of two different organizations because GIS is an evolving field. For many midstream operators, GIS is a primary technology chosen as the portal for data collection, access and management across the organization.

“Operators are continually asking, ‘Where is our infrastructure? What is around it? How do we manage the risks associated with that pipeline? How do I effectively communicate with other organizations that are working around our assets?’” he says. An effective information management system helps them do that.

Vlass adds that one common attribute of technology used in oil and gas companies, and their midstream counterparts, is an ever-increasing need for a spatial component. Managers need to know the location of their assets. A spatially accurate GIS system is also an integral part of many midstream operators’ maintenance and safety programs. The goal for many midstream operators is zero accidents, which requires great operational efficiency and a clear plan for achieving it.

Operators need to know exactly where their assets are and the history of all maintenance done to them so far—and they need to be able to access that information in a timely fashion, he says.

A mature GIS system not only tells operators where their assets are, but also provides descriptive and useful information about the asset itself and the environment around it. For example, midstream operators need to know the regulatory environment around a proposed pipeline route. “There needs to be confidence within the organization about where an asset is located,” he says.


A spatially accurate GIS system, used here as a hand-held device tracking assets near a crop field, is an integral part of many midstream operators’ maintenance and safety programs.
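As a rough illustration of what such an asset record can look like, the sketch below pairs a pipeline segment’s location with descriptive attributes about the asset and its surroundings. It assumes a GeoJSON-style feature expressed in Python; the field names and values are hypothetical, not any vendor’s actual schema.

```python
import json

# Illustrative only: a GeoJSON-style feature pairing an asset's location with
# descriptive attributes about the asset and its surroundings. Field names are
# hypothetical, not any vendor's actual schema.
pipeline_segment = {
    "type": "Feature",
    "geometry": {
        "type": "LineString",
        "coordinates": [[-104.9903, 40.5853], [-104.9721, 40.5911]],  # lon, lat pairs
    },
    "properties": {
        "asset_id": "SEG-0042",
        "material": "steel",
        "install_year": 2009,
        "last_inspection": "2023-08-14",
        "class_location": "Class 3",  # example of regulatory context around the route
        "nearby_features": ["county road", "irrigated crop field"],
    },
}

print(json.dumps(pipeline_segment, indent=2))
```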

Data versus information

Vlass makes a distinction between data and information. Data is easy to collect, and it is being generated at a staggering rate, far more than can ever be consumed. Information, on the other hand, is something that is meaningful to individuals in the organization. “There is a process of turning data into useful information that helps us do a better job of making decisions,” he says.
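A minimal sketch of that distinction, using hypothetical inspection records: the raw readings are the data, and the short list of segments overdue for inspection is the information a manager can act on.

```python
from datetime import date, timedelta

# Illustrative only: raw field readings (data) rolled up into something a manager
# can act on (information). Records, dates and the threshold are hypothetical.
raw_readings = [
    {"segment": "SEG-0042", "inspected": date(2023, 8, 14), "anomalies": 0},
    {"segment": "SEG-0042", "inspected": date(2021, 5, 2), "anomalies": 2},
    {"segment": "SEG-0107", "inspected": date(2019, 11, 30), "anomalies": 1},
]

# Keep only the most recent reading per segment.
latest = {}
for reading in raw_readings:
    seg = reading["segment"]
    if seg not in latest or reading["inspected"] > latest[seg]["inspected"]:
        latest[seg] = reading

# The "information": which segments have gone roughly three years without inspection.
cutoff = date.today() - timedelta(days=3 * 365)
overdue = sorted(seg for seg, r in latest.items() if r["inspected"] < cutoff)
print("Segments overdue for inspection:", overdue)
```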

To collect data and make it into useful information, managers and their information technology (IT) staff need to have a systematic and standard process for collecting and storing data. Once a process has been clearly defined, managers need a system with the ability to track progress as it is made.

The database management and GIS system needs to define a procedure that enables field workers to look at the infrastructure around the assets: power, roads, land and other features. It also needs to define how data is collected in the field and returned to a central location where it can be accessed by anyone who needs to see it.

Data from the field has to be collected in a systematic fashion if it is to be aggregated in any meaningful way. The new technology and workflow also have to be very easy to use and show a significant benefit to field users, or they will not be readily adopted.

“If I am going to use a lot of people to collect information, I have to do it in a standard format so that I can easily train those people and get that data back to a central location and get value out of it very quickly,” he says.

Vlass illustrates his point with a hypothetical example. He wants to count all of the cars in a multi-story parking garage. Given the enormity of the task, he sends several data collectors into the field to help with the project. One worker brings back a list of vehicle identification numbers for the cars on the second and third floors. A second counts the empty parking spaces and estimates the number of cars from what remains. Another comes back with a long list of license plate numbers. Yet another brings back a list of makes and models.

In this example, the data was collected in such a haphazard fashion that putting it together in a meaningful way is almost impossible. “I don’t have enough hours in the day to do that,” he says.
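Returning to the parking garage, an agreed-upon format fixes the problem: if every collector submits the same fields, the office can aggregate the results immediately. The sketch below is illustrative only, with hypothetical field names.

```python
# Illustrative only: if every collector submits the same fields, the office can
# aggregate the results immediately. Field names are hypothetical.
REQUIRED_FIELDS = {"collector", "garage_level", "space_id", "occupied"}

def validate(record: dict) -> dict:
    """Reject any submission that does not match the agreed-upon format."""
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        raise ValueError(f"record is missing required fields: {sorted(missing)}")
    return record

submissions = [
    {"collector": "A", "garage_level": 2, "space_id": "2-014", "occupied": True},
    {"collector": "A", "garage_level": 2, "space_id": "2-015", "occupied": False},
    {"collector": "B", "garage_level": 3, "space_id": "3-001", "occupied": True},
]

clean = [validate(r) for r in submissions]
total_cars = sum(r["occupied"] for r in clean)  # booleans sum as 0/1
print("Cars counted so far:", total_cars)
```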

The process itself requires time, and managers need to keep an eye on the time used by field and office staff as they define the process for collecting and storing data. “They need to ask: What is the time to complete that process?” Frequently, the process of assembling databases starts in an office and then goes to the field. Many operators are being asked to do more with less, and technology has to deliver value that improves productivity and lowers overall cost.

What information and efficient access to it will cost is a question midstream operators invariably ask, although firm answers can vary considerably, Vlass says. “What we encourage operators to do is to clearly define the success factors and how success will be measured. These factors cannot be defined in a vacuum but are best defined with a cross-functional team with representatives from the field, GIS, IT and management,” he says.

The time it takes to integrate two or more systems can vary considerably. Most organizations have some components in place and have a significant investment in the technology. One of the key considerations is how to get the technology to work for a given organization. Can the organization use an off-the-shelf solution or does it require a custom solution? Development can be long and expensive; configuration should be faster, Vlass says.

“We try to start with a pilot project that should take 90 days or less. The cost for that is less than $50,000, and there are prerequisites regarding existing technology components,” he says. In that short timeframe, the customer sees results with their own people in their own environment, gets an education on what is possible and can contribute to the final solution requirements. Once the pilot is complete, the solution is delivered in phases, with each phase delivering functionality that is part of the success criteria.

“It takes time to put these processes in place and to look at automation and standards. Once the foundation is established, operators can see significant improvements in operational efficiency and make minor adjustments to see greater value,” he says.

Usually, one system is older than the other. “I am still surprised by how much data collection is still being done on paper,” he says. Paper collection is troublesome for more than one reason. First, it usually needs to be transcribed from one source to another, which often leads to errors. Second, paper is often harder to access in large quantities than data stored in an organized database. Finally, paper collections often lead to “data silos,” where information is stored in one division of an organization, but is not accessible to anyone outside of that division. “Some organizations never get the value out of that data,” he says.

Many digital solutions create data silos as well, because they were developed before current server technologies came into wide use. “That is a huge issue when we look at these filing cabinets. It’s staggering when we find out how many times the same information has been collected again and again and again within an organization. It should be ‘collect once and use many,’” he says.
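A minimal sketch of the “collect once, use many” idea, assuming a single shared asset table with a hypothetical schema: the record is collected once, and different groups simply query it for their own purposes instead of gathering the data again.

```python
import sqlite3

# Illustrative only: one shared table that several groups query, instead of each
# group re-collecting the same data into its own silo. Schema and values are hypothetical.
db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE assets (
        asset_id        TEXT PRIMARY KEY,
        latitude        REAL,
        longitude       REAL,
        install_year    INTEGER,
        last_inspection TEXT
    )
""")
db.execute(
    "INSERT INTO assets VALUES (?, ?, ?, ?, ?)",
    ("SEG-0042", 40.5853, -104.9903, 2009, "2023-08-14"),
)

# Operations asks one question of the shared record...
for asset_id, last_inspection in db.execute("SELECT asset_id, last_inspection FROM assets"):
    print(asset_id, "last inspected", last_inspection)

# ...and planning asks another, without collecting the data a second time.
for asset_id, install_year in db.execute(
    "SELECT asset_id, install_year FROM assets WHERE install_year < 2015"
):
    print(asset_id, "installed in", install_year)
```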