Accuracy of methods for reporting inorganic element concentrations and radioactivity in oil and gas wastewaters from the Appalachian Basin, U.S. based on an inter-laboratory comparison

T. L. Tasker, W. D. Burgos, M. A. Ajemigbitse, N. E. Lauer, A. V. Gusa, M. Kuatbek, D. May, J. D. Landis, D. S. Alessi, A. M. Johnsen, J. M. Kaste, K. L. Headrick, F. D. H. Wilke, M. McNeal, M. Engle, A. M. Jubb, R. D. Vidic, A. Vengosh, N. R. Warner

Research output: Contribution to journal › Article › peer-review

23 Scopus citations

Abstract

Accurate and precise analyses of oil and gas (O&G) wastewaters and solids (e.g., sediments and sludge) are important for the regulatory monitoring of O&G development and tracing potential O&G contamination in the environment. In this study, 15 laboratories participated in an inter-laboratory comparison on the chemical characterization of three O&G wastewaters from the Appalachian Basin and four solids impacted by O&G development, with the goal of evaluating the quality of data and the accuracy of measurements for various analytes of concern. Using a variety of methods, analytes in the wastewaters with high concentrations (i.e., >5 mg L-1) were easily detectable with relatively high accuracy, often within ±10% of the most probable value (MPV). In contrast, often fewer than 7 of the 15 labs were able to report detectable trace metal(loid) concentrations (i.e., Cr, Ni, Cu, Zn, As, and Pb) with accuracies of approximately ±40%. Although most labs used inductively coupled plasma mass spectrometry (ICP-MS) with low instrument detection limits for trace metal analyses, large dilution factors during sample preparation and low trace metal concentrations in the wastewaters limited the number of quantifiable determinations and likely influenced analytical accuracy. In contrast, all the labs measuring Ra in the wastewaters were able to report detectable concentrations using a variety of methods, including gamma spectroscopy and wet chemical approaches following Environmental Protection Agency (EPA) standard methods. However, the reported radium activities often differed from the MPV by more than ±30%, possibly due to calibration inconsistencies among labs, radon leakage, or failure to correct for self-attenuation. Reported radium activities in solid materials showed less variability (within ±20% of the MPV), but accuracy could likely be improved by using certified radium standards and accounting for the self-attenuation that results from matrix interferences or a density difference between the calibration standard and the unknown sample. This inter-laboratory comparison illustrates that numerous methods can be used to measure major cation, minor cation, and anion concentrations in O&G wastewaters with relatively high accuracy, while trace metal(loid) and radioactivity analyses in liquids may often differ from the MPV by more than ±20%.
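The accuracy metric referred to throughout the abstract is the relative deviation of each laboratory's reported value from the most probable value (MPV). A minimal sketch of that calculation is given below; the use of the median as a consensus estimate and the example concentrations are illustrative assumptions, not values or methods taken from the study.

```python
import statistics

def percent_deviation_from_mpv(reported_values, mpv=None):
    """Relative deviation (%) of each lab's reported value from the
    most probable value (MPV). If no MPV is supplied, the median of
    the reported values is used as a simple consensus estimate
    (an assumption for illustration only)."""
    if mpv is None:
        mpv = statistics.median(reported_values)
    return [100.0 * (value - mpv) / mpv for value in reported_values]

# Hypothetical Ba concentrations (mg/L) reported by several labs
reported = [1480.0, 1520.0, 1395.0, 1610.0, 1505.0]
print(percent_deviation_from_mpv(reported))
```

A reported value whose deviation exceeds, say, ±10% would fall outside the accuracy band described for high-concentration analytes, whereas the ±30-40% spreads noted for trace metal(loid)s and radium would correspond to much larger deviations under this same metric.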

Original language: English
Pages (from-to): 224-241
Number of pages: 18
Journal: Environmental Science: Processes & Impacts
Volume: 21
Issue number: 2
DOIs
State: Published - 1 Feb 2019
Externally published: Yes

ASJC Scopus subject areas

  • Environmental Chemistry
  • Public Health, Environmental and Occupational Health
  • Management, Monitoring, Policy and Law

