Assessing PM2.5 Exposures with High Spatiotemporal Resolution across the Continental United States

Qian Di, Itai Kloog, Petros Koutrakis, Alexei Lyapustin, Yujie Wang, Joel Schwartz

Research output: Contribution to journal › Article › peer-review

358 Scopus citations


A number of models have been developed to estimate PM2.5 exposure, including satellite-based aerosol optical depth (AOD) models, land-use regression, and chemical transport model simulations, each with its own strengths and weaknesses. Variables such as the normalized difference vegetation index (NDVI), surface reflectance, absorbing aerosol index, and meteorological fields are also informative about PM2.5 concentrations. Our objective is to establish a hybrid model that incorporates multiple approaches and input variables to improve model performance. To account for complex atmospheric mechanisms, we used a neural network for its capacity to model nonlinearity and interactions. We incorporated convolutional layers, which aggregate neighboring information, into the neural network to account for spatial and temporal autocorrelation. We trained the neural network for the continental United States from 2000 to 2012 and tested it with left-out monitors. Ten-fold cross-validation revealed good model performance, with a total R2 of 0.84 on the left-out monitors. Regional R2 could be even higher for the Eastern and Central United States. Model performance remained good at low PM2.5 concentrations. We then used the trained neural network to make daily predictions of PM2.5 at 1 km × 1 km grid cells. This model allows epidemiologists to assess PM2.5 exposure in both the short term and the long term.
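The abstract's key architectural idea is feeding a convolutional aggregation of neighboring grid cells into a neural network so that spatial autocorrelation informs each cell's prediction. The minimal NumPy sketch below is purely illustrative and is not the authors' code: the grid values, the fixed mean-pooling filter, and the random untrained weights are all hypothetical stand-ins for learned components.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d_mean(grid, k=3):
    """Aggregate each cell with its k x k neighborhood by averaging --
    a fixed-weight stand-in for a learned convolutional filter that
    captures spatial autocorrelation."""
    pad = k // 2
    padded = np.pad(grid, pad, mode="edge")
    out = np.empty_like(grid, dtype=float)
    for i in range(grid.shape[0]):
        for j in range(grid.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].mean()
    return out

def relu(x):
    return np.maximum(x, 0.0)

# Toy 8 x 8 grid of one satellite-derived predictor (e.g., AOD).
aod = rng.random((8, 8))
neighbor_feat = conv2d_mean(aod)  # convolutional neighbor aggregation

# Each cell's feature vector: its own value plus the neighborhood summary.
features = np.stack([aod, neighbor_feat], axis=-1).reshape(-1, 2)

# One hidden layer with random (untrained) weights: forward pass only,
# to show the shape of the computation, not a fitted model.
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)
pm25_pred = relu(features @ W1 + b1) @ W2 + b2

print(pm25_pred.shape)  # (64, 1): one prediction per grid cell
```

In the paper's actual setting the filter weights are learned, many predictors enter the network, and training targets are monitored PM2.5 concentrations; this sketch only shows how neighborhood aggregation becomes an extra input feature per grid cell.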

Original language: English
Pages (from-to): 4712-4721
Number of pages: 10
Journal: Environmental Science and Technology
Issue number: 9
State: Published - 3 May 2016

ASJC Scopus subject areas

  • General Chemistry
  • Environmental Chemistry


