Power models used to compute corrosion loss of steel and other metals have traditionally been calibrated to time series of measured thickness losses. These models assume that environmental conditions remain constant over the period of data collection. Given changes in climate and in air quality laws, an assumption of statistical stationarity for thickness loss measurements is almost certainly not rational. The goal of this study was to provide a framework for detecting nonstationarity in thickness loss data and for adjusting measured data found to be temporally nonhomogeneous; the adjusted values would then represent a specific environmental condition over the duration of the data collection. Specific statistical tests are recommended for detecting nonstationarity, and a procedure for adjusting the measured data to a stationary record is presented for use once nonstationarity is confirmed. The modeling framework is illustrated using two types of data: one-year corrosion losses of zinc under varying atmospheric SO₂ concentrations, and steel corrosion loss rates collected over a period when SO₂ concentrations varied. The results show that modeling atmospheric corrosion using nonstationary loss data can lead to inaccurate estimates of thickness loss. Adjusting measured data to a constant environmental state will be beneficial in economic analyses, risk studies, and planning for changes in air pollution laws.
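The abstract does not name the recommended statistical tests or the exact adjustment procedure. As a minimal sketch only, assuming the classical power law L = A·t^B for cumulative loss and using a Kendall-tau trend test on the fit residuals as an illustrative stand-in for a nonstationarity check (the synthetic data and the 5% significance level are hypothetical), the calibration-and-screening workflow might look like this:

```python
# Illustrative sketch, not the authors' procedure: calibrate a power model
# to cumulative thickness-loss data, then screen the residuals for a
# monotonic trend in time as a simple indicator of nonstationarity.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import kendalltau

def power_model(t, A, B):
    """Cumulative corrosion loss (um) after exposure time t (years)."""
    return A * np.power(t, B)

# Hypothetical cumulative losses (um) over 10 years of exposure.
t = np.arange(1, 11, dtype=float)
loss = np.array([1.2, 2.1, 2.9, 3.5, 4.0, 4.3, 4.6, 4.8, 5.0, 5.1])

# Calibrate the power model under the usual constant-environment assumption.
(A, B), _ = curve_fit(power_model, t, loss, p0=(1.0, 0.5))
print(f"Fitted power model: L = {A:.3f} * t^{B:.3f}")

# Nonstationarity check: a systematic trend in the residuals over time
# suggests the environment (e.g., SO2 concentration) changed during the
# collection period, so the record should be adjusted before calibration.
residuals = loss - power_model(t, A, B)
tau, p_value = kendalltau(t, residuals)
print(f"Kendall tau of residuals vs. time: {tau:.3f} (p = {p_value:.3f})")
if p_value < 0.05:
    print("Trend detected: treat the record as nonstationary and adjust it.")
else:
    print("No significant trend detected at the 5% level.")
```

In practice the screening statistic, the significance level, and the adjustment to a stationary record would follow the specific tests and procedure recommended in the study.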