Core Ideas
- Bulk density changes caused large uncertainties in neutron moisture meter calibrations.
- Air- or water-filled annuli around access tubes simulated voids to measure error.
- Error magnitude depended on void size and on the contrast between soil and void water content.
- Experimental results showed much smaller errors than previous numerical simulations.
- Larger voids reduced the sensitivity of the neutron moisture meter to soil water content.

Abstract Air- and water-filled voids around neutron moisture meter (NMM) access tubes have been cited as sources of volumetric water content (θv) measurement error in cracking clay soils. The objectives of this study were to experimentally quantify this potential error stemming from (i) uncertainty in bulk density (ρb) sampling and (ii) the impact of air- and water-filled voids. Air- and water-filled voids were simulated using ∼0.6-cm (small) and ∼1.9-cm (large) annuli around access tubes. After NMM measurements were taken in a tightly installed access tube, either a small or large annulus was installed in the same borehole. Additional NMM measurements were taken with the annulus filled with air and then with water, and ρb and θv were measured. The RMSE of the NMM calibration using all 11 installations was 0.02 m³ m⁻³. However, if only two cores were used for calibration, the ratio of NMM-measured θv to in situ θv was significantly different (p < 0.05) from measured θv half the time (RMSE, 0.012–0.05 m³ m⁻³). Small air-filled voids created drier estimates of θv (bias, −0.039 m³ m⁻³; p < 0.001), whereas small water-filled voids were not significantly different from the calibration. Air- and water-filled voids from the larger annuli were significantly lower and higher (p < 0.001) than core-measured θv, with biases of −0.068 and 0.080 m³ m⁻³, respectively. Although this work does not correct NMM-predicted θv to matrix θv, it does bound NMM error under field conditions in a cracking clay soil.
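As background for the error statistics quoted above, a field NMM calibration is commonly a linear regression of core-measured θv on the instrument count ratio, with RMSE and bias summarizing agreement against the cores. The sketch below illustrates that bookkeeping only; the linear form, the variable names, and the numbers are assumptions for illustration, not data or methods from this study.

```python
# Illustrative sketch of a neutron moisture meter (NMM) calibration
# (linear fit of core-measured volumetric water content on count ratio)
# and the RMSE/bias statistics referenced above. All values and names
# here are hypothetical placeholders, not data from the study.
import numpy as np

count_ratio = np.array([0.8, 1.0, 1.2, 1.4, 1.6])       # NMM reading / standard count
theta_core = np.array([0.18, 0.24, 0.29, 0.35, 0.41])   # core-measured theta_v (m3 m-3)

# Ordinary least-squares calibration: theta_v = a + b * count_ratio
b, a = np.polyfit(count_ratio, theta_core, 1)
theta_nmm = a + b * count_ratio

rmse = np.sqrt(np.mean((theta_nmm - theta_core) ** 2))  # calibration RMSE
bias = np.mean(theta_nmm - theta_core)                   # mean error (negative = drier estimate)

print(f"theta_v = {a:.3f} + {b:.3f} * CR;  RMSE = {rmse:.3f} m3/m3;  bias = {bias:.3f} m3/m3")
```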
Abstract Farmers, scientists, and other soil health stakeholders require interpretable indicators of soil hydraulic function. Determining which indicators to use has been difficult because of inconsistent measurement methods, spatial and temporal variability, recently established treatments, and the effect of site characteristics on differences among management practices. The North American Project to Evaluate Soil Health Measurements includes 124 sites uniformly sampled across a range of soil health management practices in North America in 2019. We compared indicators of hydraulic function and recommend those that best characterize soil health. We assessed the relationship of each indicator to a suite of inherent soil properties and climate variables, the response of each indicator to soil health management practices, the effect that inherent soil properties (clay content, sand content, and pH) and climatic variables (10-yr mean annual precipitation and temperature) had on the response to management practices, and the relationships among the responses of the indicators to soil health management practices. Field capacity measured on intact cores (θFC_INTACT) was the best measure of soil hydraulic function because it responded to management, is a direct measure of soil hydraulic function, is proximal to stakeholder values, and its response to management was not significantly influenced by inherent or climatic variables. Other suitable indicators are bulk density, soil organic carbon (SOC), and aggregate stability, which are not direct measures of soil hydraulic function but do respond to management and may be practical in situations in which measuring θFC_INTACT is not. This study informs the selection of soil health indicators to measure soil hydraulic function.
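A minimal sketch of the style of analysis described above, assuming a paired treatment/control design: an indicator's response to management is expressed as a per-site log response ratio and regressed against the inherent (clay, sand, pH) and climatic (10-yr MAP, MAT) covariates to check whether the response depends on them. The data, column order, and use of ordinary least squares are illustrative assumptions; the study's actual statistical models may differ.

```python
# Sketch: response of a soil health indicator (e.g., field capacity on intact
# cores) to management, and a check for covariate influence. Hypothetical data;
# not the study's exact statistical procedure.
import numpy as np

# Paired site means: soil health practice vs. control (hypothetical values)
theta_fc_trt = np.array([0.31, 0.28, 0.35, 0.26, 0.33, 0.30, 0.27, 0.36])
theta_fc_ctl = np.array([0.28, 0.27, 0.31, 0.25, 0.30, 0.29, 0.26, 0.32])
response = np.log(theta_fc_trt / theta_fc_ctl)   # log response ratio per site

# Site covariates: clay (%), sand (%), pH, 10-yr MAP (mm), 10-yr MAT (degC)
covariates = np.array([
    [32, 20, 6.5,  900, 11],
    [18, 55, 7.1,  450, 14],
    [40, 10, 5.9, 1100, 10],
    [25, 35, 6.8,  600, 16],
    [35, 15, 6.2,  950, 12],
    [22, 45, 7.3,  500, 18],
    [28, 30, 6.6,  700, 15],
    [38, 12, 6.0, 1050,  9],
], dtype=float)

# Ordinary least squares: does the response depend on inherent/climatic variables?
X = np.column_stack([np.ones(len(response)), covariates])
coefs, *_ = np.linalg.lstsq(X, response, rcond=None)
print("mean response (log ratio):", response.mean().round(3))
print("covariate coefficients:", np.round(coefs[1:], 4))
```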
Abstract In the Gulf Coastal Plains of Texas, a state-of-the-art distributed network of field observatories, known as the Texas Water Observatory (TWO), has been developed to better understand the water, energy, and carbon cycles across the critical zone (encompassing aquifers, soils, plants, and atmosphere) at different spatiotemporal scales. Using more than 300 advanced real-time and near-real-time sensors, the observatory monitors water, energy, and carbon storage and fluxes at high frequency in the Brazos River corridor; these observations are critical for understanding coupled hydrologic, biogeochemical, and land-atmosphere processes in the region. TWO provides a regional resource for better understanding and managing agriculture, water resources, ecosystems, biodiversity, disasters, health, energy, and weather/climate. TWO infrastructure spans the common land uses in this region, including traditional and aspirational cultivated agriculture, rangelands, native prairie, bottomland hardwood forest, and coastal wetlands. Sites represent landforms from low-relief erosional uplands to depositional lowlands across climatic and geologic gradients of central Texas. We present the overarching vision of TWO and describe site design, instrumentation specifications, data collection, and quality control protocols. We also compare water, energy, and carbon budgets across sites, including evapotranspiration, carbon fluxes, radiation budget, weather, profile soil moisture and soil temperature, soil hydraulic properties, hydrogeophysical surveys, groundwater levels, and groundwater quality reported at the TWO primary sites for 2018-2020 (with some data gaps). In conjunction with various Earth-observing remote sensing products and legacy databases, TWO provides a comprehensive testbed for evaluating process-driven or data-driven critical zone science, leading to improved natural resource management and decision support at different spatiotemporal scales.
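One routine cross-site comparison implied by the budget reporting above is surface energy balance closure at flux towers, comparing turbulent fluxes (H + LE) with available energy (Rn − G) and converting latent heat flux to an evapotranspiration depth. The sketch below is a generic illustration with hypothetical half-hourly values; it is not TWO's quality-control protocol.

```python
# Generic sketch: energy balance closure and ET from half-hourly flux data.
# Hypothetical values; not the TWO quality-control procedure itself.
import numpy as np

# Half-hourly fluxes in W m-2 (hypothetical)
Rn = np.array([450.0, 520.0, 480.0, 300.0])   # net radiation
G  = np.array([ 60.0,  70.0,  65.0,  40.0])   # ground heat flux
H  = np.array([120.0, 150.0, 140.0,  90.0])   # sensible heat flux
LE = np.array([240.0, 280.0, 250.0, 150.0])   # latent heat flux

# Energy balance closure ratio: (H + LE) / (Rn - G); ~1.0 indicates closure
closure = (H + LE).sum() / (Rn - G).sum()

# Convert latent heat flux to evapotranspiration depth (mm per 30-min interval)
LAMBDA = 2.45e6                      # latent heat of vaporization, J kg-1
et_mm = LE * 1800.0 / LAMBDA         # kg m-2 (= mm) per half hour

print(f"closure ratio = {closure:.2f}")
print("ET per half hour (mm):", np.round(et_mm, 3))
```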
Plants remove carbon dioxide from the atmosphere through photosynthesis. Because agriculture's productivity is based on this process, a combination of technologies to reduce emissions and enhance soil carbon storage can allow this sector to achieve net negative emissions while maintaining high productivity. Unfortunately, current row-crop agricultural practice generates about 5% of greenhouse gas emissions in the United States and European Union. To mitigate these emissions, significant effort has been focused on changing farm management practices to maximize soil carbon storage. In contrast, the potential to reduce the emissions themselves has largely been neglected. Through a combination of innovations in digital agriculture, crop and microbial genetics, and electrification, we estimate that a 71% (1,744 kg CO₂e/ha) reduction in greenhouse gas emissions from row crop agriculture is possible within the next 15 y. Importantly, emission reduction can lower the barrier to broad adoption by proceeding through multiple stages with meaningful improvements that gradually facilitate the transition to net negative practices. Emerging voluntary and regulatory ecosystem services markets will incentivize progress along this transition pathway and guide public and private investments toward technology development. In the difficult quest for net negative emissions, all tools, including emission reduction and soil carbon storage, must be developed to allow agriculture to maintain its critical societal function of provisioning society while, at the same time, generating environmental benefits.
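For scale, the stated figures can be unpacked with simple arithmetic: if a 71% cut corresponds to 1,744 kg CO₂e/ha, the implied baseline is about 1,744 / 0.71 ≈ 2,456 kg CO₂e/ha, leaving roughly 712 kg CO₂e/ha afterward. The short sketch below only reproduces that arithmetic; the baseline intensity is inferred here, not stated in the text.

```python
# Back-of-the-envelope check of the emission figures quoted above.
# The baseline intensity is inferred from the stated 71% and 1,744 kg CO2e/ha.
reduction_fraction = 0.71
reduction_kg_per_ha = 1744.0

baseline_kg_per_ha = reduction_kg_per_ha / reduction_fraction    # ~2,456 kg CO2e/ha
remaining_kg_per_ha = baseline_kg_per_ha - reduction_kg_per_ha   # ~712 kg CO2e/ha

print(f"implied baseline: {baseline_kg_per_ha:.0f} kg CO2e/ha")
print(f"implied remaining emissions: {remaining_kg_per_ha:.0f} kg CO2e/ha")
```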