An international science team assessed predictions from multiple oil spill models and found that subsea dispersants used during response to a simulated accidental blowout may reduce oil droplet size by at least one order of magnitude.
This size reduction increased the predicted distance at which oil surfaced downstream of the blowout site by one to two orders of magnitude, potentially leading to less oil reaching the surface. They published their findings in the July 2015 Marine Pollution Bulletin: Intercomparison of oil spill prediction models for accidental blowout scenarios with and without subsea chemical dispersant injection.
Responders’ decision-making processes for a spill event include predictions from integrated oil spill models to help weigh the costs and benefits of response tactics. During the 2010 spill, responders used subsea dispersant injection to reduce the surface slick, which could improve air quality in the response zone and reduce coastline oiling. This study evaluated dispersant effectiveness by comparing predictions from several academic and industry modeling systems for fourteen theoretical scenarios at three spill stages: formation of oil droplets at the initial release point; vertical rise through the nearfield plume, where one or more oil intrusion layers form in the water column; and farfield transport, where ocean currents carry the oil particles away from the site.
“Modelers have access to diverse experiment data that they can use to validate model predictions,” explained study author Scott Socolofsky. “This study is unique because we ran multiple blowout scenarios on five representative models and compared their predictions at different oil and gas transport stages to better understand the variation between models.”
Predicted oil droplet sizes were 0.01–0.8 millimeters with dispersant compared to 0.3–6.0 millimeters without dispersant for the theoretical spill case of 20,000 barrels per day. Dispersant applied at the source was assumed to reduce the interfacial tension between oil and water 200-fold, which accounts for the significantly smaller droplet sizes. These smaller dispersed droplets were predicted to surface farther downstream from the wellhead, with distances increasing from hundreds of meters to tens of kilometers.
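The order-of-magnitude claims above can be illustrated with a back-of-envelope sketch. Everything below is an idealized scaling exercise, not the study’s models: a Hinze-type turbulent-breakup relation (droplet size scaling with interfacial tension to the 3/5 power), Stokes’ law for small-droplet rise, and assumed values for release depth, current speed, and oil properties.

```python
# Back-of-envelope sketch, NOT the study's models: idealized scalings only.

# 1) Turbulent-breakup (Hinze-type) scaling: droplet size d ~ sigma^(3/5),
# so a 200-fold interfacial-tension reduction shrinks droplets by roughly:
size_ratio = 200.0 ** (3.0 / 5.0)
print(f"droplet size reduction: ~{size_ratio:.0f}x")  # ~24x, about one order of magnitude

# 2) Surfacing distance: a droplet rising at speed w from depth h in a
# horizontal current u surfaces roughly x = u * h / w downstream.
depth = 1500.0   # m, Deepwater-Horizon-like release depth (assumed)
current = 0.10   # m/s, assumed mean horizontal current

# Large untreated droplet (~5 mm): Stokes' law no longer applies at this
# size; empirical rise speeds are roughly 0.1-0.15 m/s (assumed value below).
w_large = 0.12
# Small dispersed droplet (~0.3 mm): Stokes' law, w = g*drho*d^2/(18*mu),
# with assumed oil-water density difference drho and seawater viscosity mu.
g, drho, mu, d = 9.81, 150.0, 1.5e-3, 0.3e-3
w_small = g * drho * d**2 / (18.0 * mu)

for label, w in (("untreated ~5 mm", w_large), ("dispersed ~0.3 mm", w_small)):
    x_km = current * depth / w / 1000.0
    print(f"{label}: rise speed {w:.4f} m/s, surfaces ~{x_km:.1f} km downstream")
```

With these assumed numbers the untreated droplets surface within roughly a kilometer of the wellhead while the dispersed droplets surface tens of kilometers downstream, consistent with the one-to-two order-of-magnitude spread the models reported.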
Most nearfield plume models predicted a single intrusion layer a few hundred meters above the release point. For deepwater cases, the authors’ models predicted that the seawater flow rate in this intrusion would be between 1,000 and 5,000 m³/s, which is within an order of magnitude of the Mississippi River flow as it enters the Gulf. In shallow cases, hydrocarbons traveled directly to the sea surface, which could be significant for accidents at depths of less than 200 meters.
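The Mississippi River comparison can be checked with a quick calculation; the mean discharge figure used below (about 17,000 m³/s) is a commonly cited approximation, not a number from the study.

```python
# Quick order-of-magnitude check; the Mississippi discharge figure is an
# assumed approximation (~17,000 m^3/s mean), not taken from the study.
mississippi = 1.7e4                # m^3/s, approximate mean discharge
intrusion_flows = (1.0e3, 5.0e3)   # m^3/s, predicted intrusion flow range

for q in intrusion_flows:
    print(f"intrusion {q:.0f} m^3/s: Mississippi is {mississippi / q:.1f}x larger")
# Ratios of roughly 3-17x, i.e. within about one order of magnitude.
```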
Variations in farfield model predictions were larger, reflecting cumulative differences in simulated droplet size and nearfield plume behavior. Models that predicted early oil surfacing assumed larger droplet sizes and did not include dissolution and biodegradation, while models predicting later oil surfacing included these factors. There was no consensus on the mass flux to the surface because of wide-ranging variable choices. However, models that included fate processes, such as oil suspended in the water column or being dissolved or biodegraded during transit, predicted up to 95% of the oil would not surface within 14 days of release when subsea dispersants were used.
Socolofsky said that this study affirmed the common adage that ‘droplet size is everything’ when predicting oil and gas fate. The team noted that limitations of available field data and wide variations in predictions highlighted the need for new models, validation data, and data interpretation methods. “Our understanding of droplet generation at a blowout has improved through work funded by GoMRI and other agencies,” said Socolofsky, “but scaling laboratory results to field conditions remains a major challenge.”
The study’s authors are Scott A. Socolofsky, E. Eric Adams, Michel C. Boufadel, Zachary M. Aman, Øistein Johansen, Wolfgang J. Konkel, David Lindo, Mads N. Madsen, Elizabeth W. North, Claire B. Paris, Dorte Rasmussen, Mark Reed, Petter Rønningen, Lawrence H. Sim, Thomas Uhrenholdt, Karl G. Anderson, Cortis Cooper, and Tim J. Nedwed.
This research was made possible in part by a grant from the Gulf of Mexico Research Initiative (GoMRI) to the Gulf of Mexico Integrated Spill Response Consortium (GISR) and the Center for Integrated Modeling and Analysis of Gulf Ecosystems (C-IMAGE). Other funding sources included the API Joint Industry Task Force D3 Subsurface Dispersant Injection team.
The Gulf of Mexico Research Initiative (GoMRI) is a 10-year independent research program established to study the effect, and the potential associated impact, of hydrocarbon releases on the environment and public health, as well as to develop improved spill mitigation, oil detection, characterization and remediation technologies. An independent and academic 20-member Research Board makes the funding and research direction decisions to ensure the intellectual quality, effectiveness and academic independence of the GoMRI research. All research data, findings and publications will be made publicly available. The program was established through a $500 million financial commitment from BP.