E&P activities off the coasts of Texas, Louisiana, Mississippi and Alabama are regulated by the U.S. Bureau of Ocean Energy Management (BOEM), under the authority of the Secretary of the Interior. If an E&P plan's projected pollutant emissions exceed BOEM's "exemption threshold" for the plan (a factor related to the plan's distance offshore), the plan requires an air quality review. These reviews employ sophisticated modeling of pollutant transport and dispersion to determine the emissions' effects onshore. The offshore E&P community performing air quality reviews has been anticipating a significant shift in approved pollutant modeling technology that will transform a typical air quality review into a highly complex computational analysis. This shift stems from a concerted push by regulators (both onshore and offshore) to evaluate the effects of a complicated class of pollutants known as secondary pollutants.
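The distance-based exemption formulas can be sketched as follows. This is an illustrative sketch only, assuming the coefficients published in BOEM's air quality regulation (30 CFR 550.303): a two-thirds-power formula for carbon monoxide and a linear formula for other pollutants, with distance in statute miles and the threshold in tons per year. Confirm against the current regulation before any actual use.

```python
def exemption_threshold(distance_miles: float, pollutant: str) -> float:
    """Return an illustrative emission exemption threshold (tons/year).

    distance_miles : the plan's distance from shore, statute miles
    pollutant      : pollutant name, e.g. "CO", "NOx", "SO2", "VOC"
    """
    if pollutant.upper() == "CO":
        # Carbon monoxide uses the two-thirds-power formula.
        return 3400 * distance_miles ** (2 / 3)
    # Other criteria pollutants use the linear formula.
    return 33.3 * distance_miles

# A hypothetical plan 40 miles offshore:
nox_limit = exemption_threshold(40, "NOx")  # 33.3 tons/yr per mile
co_limit = exemption_threshold(40, "CO")
print(round(nox_limit), round(co_limit))
```

If the plan's projected annual emissions of a pollutant exceed the corresponding threshold, the air quality review described above is triggered.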
Secondary pollutant effect
An offshore E&P air quality review under BOEM is required to examine so-called “primary” pollutant effects onshore. In other words, one reviews the increase in the concentration of pollutant “X” onshore due to sources of pollutant “X” emissions within a project. However, expanded reviews of “secondary” pollutants are likely in store for offshore projects in the not-so-distant future (Figure 1).
Secondary pollutants are produced by chemical interactions between primary pollutants and other compounds already present in the ambient air. One example of a secondary pollutant is ozone (O3), which can be created when nitrogen oxide (NOx) and volatile organic compound (VOC) emissions from a project (i.e., the primary emissions) combine with other ambient gases, such as oxygen (O2), in the presence of sunlight. To quantify a project's effect on onshore ozone concentrations, one would need to model the project's primary emissions using a model that accounts for the in situ chemical reactions of the emissions while they disperse and are transported by the wind.
Quantum shift of complexity
Regulatory agencies appear to be moving toward a modeling approach called photochemical grid modeling (PGM). This technology, while a significant advance in the dynamic representation of long-range transport and chemical interactions of a project's emissions, is a quantum shift of complexity for compounding reasons: its encoded chemical mechanisms are sophisticated, they consume a substantial amount of input data, and that data demands far more extensive computational resources.
The chemical reactions that form the pathway from primary pollutants (e.g., NOx) to secondary pollutants (e.g., O3) are complex and involve feedback reactions (NOx and O2 producing other NOx compounds that in turn can generate more O3), all of which are accounted for in a PGM. Today's PGMs also consider chemicals generated from lightning strikes, sea salt, windblown dust, biogenic emissions and other highly dynamic, nontrivial interactions within the atmosphere. By contrast, primary pollutant modeling considers very basic or no chemistry and often only evaluates the transport and dispersion of the emitted pollutant across the model domain (Figure 2).
Robust 3-D gridded datasets of meteorological information (e.g., winds, temperature, humidity and pressure) also are required in PGMs to drive the downwind chemistry and mix the ambient air. These data should span multiple years to represent the climatological forcing of the area of interest and require the use of independent meteorological models (themselves a set of complex numerical programs) to generate the necessary inputs to PGMs. By contrast, primary pollutant modeling often uses only scattered weather observations over the area of interest.
The complex chemistry considered in a PGM also requires extensive regional emissions input data. National emissions datasets for past years are freely available and can be leveraged to represent the concentrations of other chemically reactive compounds in the ambient air. Figure 3 shows typical emission sources used as input to PGM systems for a sample E&P project in the Gulf of Mexico. By contrast, primary pollutant modeling often needs only the project's own emissions and discrete background monitor values.
The complex chemistry and copious input data of a PGM system naturally lead to a large draw on computational resources to handle the necessary data processing and model iteration. Some industry practitioners estimate that running a PGM requires roughly 10 times the processing power, memory and storage of standard primary pollutant modeling. While this seems daunting, today's cloud computing environments, such as Microsoft Azure or Amazon Web Services, are readily available and ameliorate much of the challenge of the increased resource requirements.
The coming shift in approved air pollution modeling technology for the BOEM-regulated offshore E&P market will usher in potential advances in the science and understanding of offshore emission effects over land. However, along with those possible steps toward an improved scientific representation of air pollution come certain increases in complexity, resource requirements and schedule that offshore operators will need to navigate.