Dynamic changes in live fuel moisture (LFM) and fuel condition modify fire danger in shrublands. We investigated the empirical relationship between field‐measured LFM and remotely sensed greenness and moisture measures from the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and the Moderate Resolution Imaging Spectroradiometer (MODIS). Key goals were to assess how these relationships varied between sensors, across sites, and across years. Most AVIRIS‐derived measures were highly correlated with LFM. The visible atmospherically resistant index (VARI) and visible green index (VIg) outperformed all moisture measures. The water index (WI) and normalized difference water index (NDWI) had the highest correlations among the moisture measures. All relationships were nonlinear, and a linear relationship applied only above 60% LFM. Changes in the fractions of green vegetation (GV) and nonphotosynthetic vegetation (NPV) were good indicators of changes in fuels below the 60% LFM threshold. AVIRIS‐ and MODIS‐derived measures were highly correlated but lacked a 1:1 relationship. MODIS‐derived greenness and moisture measures were also highly correlated with LFM but generally had lower correlations than AVIRIS and varied between sites. LFM relationships improved when data were pooled by functional type. LFM interannual variability affected these relationships, producing higher correlations in wetter years, with VARI and VIg showing the highest correlations across years. The lowest correlations were observed for sites that included two different functional types or multiple land cover classes (i.e., urban and roads) within a MODIS footprint. Higher correlations for uniform sites and improved relationships for functional types suggest that MODIS can map LFM effectively in shrublands.
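For reference, the greenness and moisture measures named above follow standard published formulations. The sketch below is a minimal illustration assuming per-pixel surface reflectance arrays; the band centers (e.g., 858 nm NIR and 1240 nm SWIR for a MODIS-style NDWI, 900/970 nm for WI) are illustrative choices, not necessarily the exact AVIRIS/MODIS bands used in the study.

```python
import numpy as np

def spectral_indices(blue, green, red, nir, swir_1240, r900, r970):
    """Greenness and moisture indices commonly used to track live fuel moisture.

    Inputs are surface reflectance arrays for illustrative band centers;
    the study's exact band selections may differ.
    """
    vari = (green - red) / (green + red - blue)    # visible atmospherically resistant index
    vig  = (green - red) / (green + red)           # visible green index
    ndvi = (nir - red) / (nir + red)               # chlorophyll-sensitive greenness
    ndwi = (nir - swir_1240) / (nir + swir_1240)   # canopy water absorption (Gao-style NDWI)
    wi   = r900 / r970                             # water index (970 nm water absorption feature)
    return {"VARI": vari, "VIg": vig, "NDVI": ndvi, "NDWI": ndwi, "WI": wi}
```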
As human exposure to hydroclimatic extremes increases and the number of in situ precipitation observations declines, precipitation estimates, such as those provided by the Integrated Multi-satellitE Retrievals for GPM (IMERG) from the Global Precipitation Measurement (GPM) mission, provide a critical source of information. Here, we present a new gauge-enhanced dataset [the Climate Hazards Center IMERG with Stations (CHIMES)] designed to support global crop and hydrologic modeling and monitoring. CHIMES enhances the IMERG Late Run product using an updated Climate Hazards Center (CHC) high-resolution climatology (CHPclim) and low-latency rain gauge observations. CHPclim differs from other products because it incorporates long-term averages of satellite precipitation, which increases CHPclim's fidelity in data-sparse areas with complex terrain. This fidelity translates into performance increases in unbiased IMERG Late data, which we refer to as CHIME. This is augmented with gauge observations to produce CHIMES. The CHC's curated rain gauge archive contains valuable contributions from many countries. There are two versions of CHIMES: preliminary and final. The final product has more copious and better-curated station data. Every pentad and month, bias-adjusted IMERG Late fields are combined with gauge observations to create pentadal and monthly CHIMESprelim and CHIMESfinal. Comparisons with pentadal, high-quality gridded station data show that IMERG Late performs well (r = 0.75) but has some systematic biases that can be reduced. Monthly cross-validation results indicate that unbiasing increases the variance explained from 50% to 63% and decreases the mean absolute error from 48 to 39 mm month⁻¹. Gauge enhancement then increases the variance explained to 75%, reducing the mean absolute error to 27 mm month⁻¹.
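The two-step structure described above (climatological bias adjustment followed by gauge enhancement) can be illustrated schematically. This is a minimal sketch assuming a simple ratio-based adjustment toward a CHPclim-like climatology and a precomputed gauge weighting field; the array names, clipping limits, and blending weights are hypothetical, and the operational CHC procedure is more elaborate.

```python
import numpy as np

def chimes_like_blend(imerg_late, imerg_ltm, chp_clim, gauge_vals, gauge_wts):
    """Illustrative two-step enhancement of an IMERG Late precipitation field.

    imerg_late : satellite field for the current pentad/month
    imerg_ltm  : long-term mean of the satellite product for that pentad/month
    chp_clim   : high-resolution climatology (CHPclim-like) for that pentad/month
    gauge_vals : gauge-interpolated field; gauge_wts : weights in [0, 1]
    All inputs are hypothetical placeholders for this sketch.
    """
    # Step 1: scale the satellite field so its long-term mean matches the climatology.
    ratio = np.divide(chp_clim, imerg_ltm,
                      out=np.ones_like(chp_clim, dtype=float), where=imerg_ltm > 0)
    unbiased = imerg_late * np.clip(ratio, 0.2, 5.0)  # cap extreme adjustments (assumed limits)

    # Step 2: nudge grid cells toward nearby gauge observations.
    blended = gauge_wts * gauge_vals + (1.0 - gauge_wts) * unbiased
    return blended
```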
Amazonia has experienced large-scale regional droughts that affect forest productivity and biomass stocks. Space-borne remote sensing provides basin-wide data on impacts of meteorological anomalies, an important complement to relatively limited ground observations across the Amazon's vast and remote humid tropical forests. Morning-overpass QuikSCAT Ku-band microwave backscatter from the forest canopy was anomalously low during the 2005 drought, relative to the full instrument record of 1999-2009, and low morning backscatter persisted for 2006-2009, after which the instrument failed. The persistent low backscatter has been suggested to indicate increased forest vulnerability to future drought. To better ascribe the cause of the low post-drought backscatter, we analyzed multiyear, gridded remote sensing data sets of precipitation, land surface temperature, forest cover and forest cover loss, and microwave backscatter over the 2005 drought region in the southwestern Amazon Basin (4°-12°S, 66°-76°W) and in adjacent 8° × 10° regions to the north and east. We found moderate to weak correlations with the spatial distribution of persistent low backscatter for variables related to three groups of forest impacts: the 2005 drought itself, loss of forest cover, and warmer and drier dry seasons in the post-drought vs. pre-drought years. However, these variables explained only about one quarter of the variability in depressed backscatter across the southwestern drought region. Our findings indicate that drought impact is a complex phenomenon and that better understanding can only come from more extensive ground data and/or analysis of frequent, spatially comprehensive, high-resolution data or imagery before and after droughts.
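The "explained only about one quarter of the variability" statement amounts to a regression of the backscatter anomaly on the candidate drivers across grid cells. A minimal sketch follows; the predictor names are placeholders for the gridded drought, forest-loss, and dry-season variables, and the study's actual statistical approach may differ.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def variance_explained(backscatter_anom, drought_anom, cover_loss, dry_season_warming):
    """Fraction of the spatial pattern of persistent low backscatter explained by
    candidate drivers (illustrative variable names, per-grid-cell regression)."""
    X = np.column_stack([drought_anom.ravel(), cover_loss.ravel(), dry_season_warming.ravel()])
    y = backscatter_anom.ravel()
    ok = np.all(np.isfinite(X), axis=1) & np.isfinite(y)  # drop masked or missing cells
    model = LinearRegression().fit(X[ok], y[ok])
    return model.score(X[ok], y[ok])                      # R^2 across the drought region
```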
Normalized Difference Vegetation Index (NDVI) and Normalized Difference Water Index (NDWI) were compared for monitoring live fuel moisture in a shrubland ecosystem. Both indices were calculated from 500 m spatial resolution Moderate Resolution Imaging Spectroradiometer (MODIS) reflectance data covering a 33‐month period from 2000 to 2002. Both NDVI and NDWI were positively correlated with live fuel moisture measured by the Los Angeles County Fire Department (LACFD). NDVI had R² values ranging from 0.25 to 0.60, while NDWI had significantly higher R² values, varying between 0.39 and 0.80. Water absorption measures, such as NDWI, may prove more appropriate for monitoring live fuel moisture than measures of chlorophyll absorption such as NDVI.
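The comparison reported above amounts to fitting separate linear regressions of live fuel moisture on each index and comparing R². A minimal sketch, assuming paired arrays of site measurements and index time series (variable names are illustrative):

```python
from scipy.stats import linregress

def compare_indices(lfm, ndvi, ndwi):
    """R^2 of simple linear fits LFM ~ index for a greenness vs. a water index.

    lfm, ndvi, ndwi: 1-D arrays of collocated field LFM and MODIS index values
    for one sampling site (illustrative inputs)."""
    r2 = {}
    for name, index in [("NDVI", ndvi), ("NDWI", ndwi)]:
        fit = linregress(index, lfm)   # slope, intercept, rvalue, ...
        r2[name] = fit.rvalue ** 2
    return r2
```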
Accurate and operational indicators of the start of growing season (SOS) are critical for crop modeling, famine early warning, and agricultural management in the developing world. Erroneous SOS estimates, whether late or early relative to actual planting dates, can lead to inaccurate crop production and food-availability forecasts. Adapting rainfed agriculture to climate change requires improved harmonization of planting with the onset of rains, and the rising ubiquity of mobile phones in East Africa enables real-time monitoring of this important agricultural decision. We investigate whether antecedent agro-meteorological variables and household-level attributes can be used to predict planting dates of small-scale maize producers in central Kenya. Using random forest models, we compare remote estimates of SOS with field-level survey data of actual planting dates. We compare three years of planting dates (2016–2018) for two rainy seasons (the October-to-December short rains and the March-to-May long rains) gathered from weekly Short Message Service (SMS) mobile phone surveys. In situ data are compared to SOS from the Water Requirement Satisfaction Index (SOSWRSI) and other agro-meteorological variables from Earth observation (EO) datasets (rainfall, NDVI, and evaporative demand). The majority of farmers planted within 20 days of the SOSWRSI from 2016 to 2018. In the 2016 long rains season, many farmers reported planting late, which corresponds to drought conditions. We find that models relying solely on EO variables perform as well as models using both socio-economic and EO variables. The predictive accuracy of EO variables appears to be insensitive to differences in the reference periods tested for deriving EO anomalies (1, 3, 5, or 10 years). As such, it would appear that farmers are responding either to short-term weather conditions (e.g., intra-seasonal variability) or to longer trends than were included in this study (e.g., 25–30 years) when planting. The methodology used in this study, weekly SMS surveys, provides an operational means for estimating farmer behaviors, information that is traditionally difficult and costly to collect.
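The model comparison described above (EO variables alone vs. EO plus socio-economic attributes) can be sketched with a random forest and cross-validation. The feature matrices, hyperparameters, and scoring choice below are illustrative assumptions, not the study's exact configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

def compare_feature_sets(X_eo, X_household, y_planting_doy, seed=0):
    """Cross-validated skill of planting-date prediction with and without
    household attributes.

    X_eo          : antecedent rainfall/NDVI/evaporative-demand features
    X_household   : socio-economic attributes from the SMS surveys
    y_planting_doy: reported planting day of year
    All names and shapes are illustrative placeholders."""
    rf = RandomForestRegressor(n_estimators=500, random_state=seed)
    score_eo = cross_val_score(rf, X_eo, y_planting_doy, cv=5, scoring="r2").mean()
    X_both = np.hstack([X_eo, X_household])
    score_both = cross_val_score(rf, X_both, y_planting_doy, cv=5, scoring="r2").mean()
    return {"EO only": score_eo, "EO + household": score_both}
```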
The 2010 BP Deepwater Horizon (DWH) oil spill damaged thousands of km² of intertidal marsh along shorelines that had been experiencing elevated rates of erosion for decades. Yet the contribution of marsh oiling to landscape-scale degradation and subsequent land loss has been difficult to quantify. Here, we applied advanced remote sensing techniques to map changes in marsh land cover and open water before and after oiling. We segmented the marsh shorelines into non-oiled and oiled reaches and calculated the land loss rates for each 10% increase in oil cover (e.g., 0% to >70%) to determine whether land loss rates for each reach oiling category were significantly different before and after oiling. Finally, we calculated background land-loss rates to separate natural and oil-related erosion and land loss. Oiling caused significant increases in land losses, particularly along reaches of heavy oiling (>20% oil cover). For reaches with ≥20% oiling, land loss rates increased abruptly during the 2010-2013 period, and the loss rates during this period are significantly different from both the pre-oiling (p < 0.0001) and 2013-2016 post-oiling periods (p < 0.0001). The pre-oiling and 2013-2016 post-oiling periods exhibit no significant differences in land loss rates across oiled and non-oiled reaches (p = 0.557). We conclude that oiling increased land loss by more than 50%, but that land loss rates returned to background levels within 3-6 years after oiling, suggesting that oiling results in a large but temporary increase in land loss rates along the shoreline.
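The reach-level comparison can be sketched as a grouping of loss rates by oiling bin and period plus a significance test. The column names, the 10% binning, and the use of Welch's t-test below are illustrative assumptions rather than the paper's exact statistical procedure.

```python
import pandas as pd
from scipy import stats

def loss_rates_by_oiling(df):
    """Mean land-loss rate per oiling class and period, plus a test of whether
    heavily oiled reaches lost land faster than non-oiled reaches in 2010-2013.

    Expects one row per shoreline reach and period with columns
    'oil_cover_pct', 'period', and 'loss_rate' (hypothetical schema)."""
    df = df.copy()
    df["oil_class"] = (df["oil_cover_pct"] // 10) * 10            # 10% oiling bins
    summary = df.groupby(["period", "oil_class"])["loss_rate"].mean().unstack()

    post = df[df["period"] == "2010-2013"]
    heavy = post.loc[post["oil_cover_pct"] >= 20, "loss_rate"]
    clean = post.loc[post["oil_cover_pct"] == 0, "loss_rate"]
    t, p = stats.ttest_ind(heavy, clean, equal_var=False)         # Welch's t-test
    return summary, p
```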