Time-averaged shear-wave velocity over the upper 30 m of the earth's surface (VS30) is a key parameter for estimating ground-motion amplification, serving as both a predictive and a diagnostic tool for earthquake hazards. A first-order approximation of VS30 is commonly obtained from topographic-slope or terrain proxies because digital elevation models are widely available. However, better-constrained VS30 maps have been developed in many regions. Such maps preferentially employ various combinations of VS30 measurements; higher-resolution elevation models; lithologic, geologic, geomorphic, and other proxies; and they often utilize refined interpolation schemes. We develop a new hybrid global VS30 map database that defaults to the global slope-based VS30 map but smoothly inserts regional VS30 maps where available. In addition, we compare the default slope-based proxy maps against the new hybrid version in terms of VS30 maps, amplification-ratio maps, and uncertainties in assigned VS30 values.
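The "time-averaged" velocity in this definition is a harmonic (travel-time) average, not an arithmetic one. A minimal sketch makes this concrete; the helper below is a hypothetical illustration (not code from the maps described here) that divides 30 m by the total vertical shear-wave travel time through a layered profile.

```python
def vs30(thicknesses_m, velocities_mps):
    """Time-averaged shear-wave velocity over the upper 30 m.

    VS30 = 30 / sum(d_i / Vs_i), where d_i and Vs_i are the thickness
    and shear-wave velocity of layer i (top down), with the profile
    truncated at 30 m depth.
    """
    depth, travel_time = 0.0, 0.0
    for d, v in zip(thicknesses_m, velocities_mps):
        d = min(d, 30.0 - depth)  # clip the last layer at 30 m depth
        travel_time += d / v
        depth += d
        if depth >= 30.0:
            break
    if depth < 30.0:
        raise ValueError("velocity profile shallower than 30 m")
    return 30.0 / travel_time

# 10 m of 200 m/s soil over 20 m of 400 m/s soil:
# travel time = 10/200 + 20/400 = 0.1 s, so VS30 = 300 m/s
print(vs30([10.0, 20.0], [200.0, 400.0]))
```

Because the average is travel-time weighted, slow near-surface layers dominate the result, which is why VS30 is sensitive to shallow soil conditions.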
For additional information, contact: Menlo Park, Calif. Office—Earthquake Science Center, U.S. Geological Survey, 345 Middlefield Road, MS 977, Menlo Park, CA 94025, http://earthquake.usgs.gov/
VS30, the time-averaged shear-wave velocity (VS) to a depth of 30 meters, is a key index adopted by the earthquake engineering community to account for seismic site conditions. VS30 is typically based on geophysical measurements of VS derived from invasive and noninvasive techniques at sites of interest. Owing to cost considerations, as well as logistical and environmental concerns, VS30 data are sparse or not readily available for most areas. Where data are available, VS30 values are often assembled in assorted formats that are accessible from disparate and (or) impermanent Web sites. To help remedy this situation, we compiled VS30 measurements obtained by studies funded by the U.S. Geological Survey (USGS) and other governmental agencies. Thus far, we have compiled VS30 values for 2,997 sites in the United States, along with metadata for each measurement from government-sponsored reports, Web sites, and scientific and engineering journals. Most of the data in our VS30 compilation originated from publications directly reporting the work of field investigators. A small subset (less than 20 percent) of VS30 values was previously compiled by the USGS and other research institutions. Whenever possible, VS30 values originating from these earlier compilations were cross-checked against published reports. Both downhole and surface-based VS30 estimates are represented in our compilation. Most of the VS30 data are for sites in the western contiguous United States (2,141 sites), whereas 786 values are for sites in the Central and Eastern United States; 70 values are for sites in other parts of the United States, including Alaska (15 sites), Hawaii (30 sites), and Puerto Rico (25 sites).
An interactive map is hosted on the primary USGS Web site for accessing VS30 data (https://earthquake.usgs.gov/data/vs30/us/).
Shear-wave velocity (Vs) offers a means to evaluate the seismic resistance of soil to liquefaction through a fundamental soil property. This paper presents the results of an 11-year international project to gather new Vs site data and develop probabilistic correlations for seismic soil liquefaction occurrence. Toward that objective, shear-wave velocity test sites were identified, and measurements were made for 301 new liquefaction field case histories in China, Japan, Taiwan, Greece, and the United States over a decade. The majority of these new case histories reoccupy sites previously investigated by penetration testing. These new data are combined with previously published case histories to build a global catalog of 422 case histories of Vs liquefaction performance. Bayesian regression and structural reliability methods facilitate a probabilistic treatment of the Vs catalog for performance-based engineering applications. Where possible, uncertainties of the variables comprising both the seismic demand and the soil capacity were estimated and included in the analysis, resulting in greatly reduced overall model uncertainty relative to previous studies. The presented data set and probabilistic analysis also help resolve the ancillary issues of adjustment for soil fines content and magnitude scaling factors.
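Probabilistic liquefaction correlations of this kind map a measured soil capacity (overburden-corrected Vs) and a seismic demand (cyclic stress ratio) onto a probability of triggering. A common functional form for such curves is a probit (standard-normal CDF) model; the sketch below uses that generic form with made-up placeholder coefficients, not the fitted values from the study's Bayesian regression.

```python
from math import erf, log, sqrt

def std_normal_cdf(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Placeholder coefficients for illustration only; real values would come
# from a Bayesian regression of a liquefaction case-history catalog.
A, B, C, SIGMA = 11.0, 3.0, 62.0, 3.0

def p_liquefaction(vs1_mps, csr):
    """Probability of liquefaction triggering in a generic probit form:

        P = Phi(-(A*ln(Vs1) - B*ln(CSR) - C) / SIGMA)

    Higher overburden-corrected velocity (vs1_mps) lowers P, and higher
    cyclic stress ratio (csr) raises P, matching the physical intuition
    that stiffer soils resist liquefaction under a given demand.
    """
    return std_normal_cdf(-(A * log(vs1_mps) - B * log(csr) - C) / SIGMA)
```

The model uncertainty term SIGMA plays the role the abstract highlights: propagating measurement uncertainties into the regression shrinks SIGMA, steepening the probability curves.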
Abstract Because the amount of available ground-motion data has increased over the last decades, the need for automated processing algorithms has also increased. One difficulty with automated processing is screening for clipped records. Clipping occurs when the ground-motion amplitude exceeds the dynamic range of the linear response of the instrument. Clipped records are relatively easy to identify visually yet challenging for automated algorithms. In this article, we seek to identify a reliable and fully automated clipping-detection algorithm tailored to near-real-time earthquake response needs. We consider multiple alternative algorithms, based on (1) the percentage difference between adjacent data points, (2) the standard deviation of the data within a moving window, (3) the shape of the histogram of the recorded amplitudes, (4) the second derivative of the data, and (5) the amplitude of the data. To quantitatively compare these algorithms, we construct development and holdout datasets from earthquakes across a range of geographic regions, tectonic environments, and instrument types. We manually classify each record for the presence of clipping and use the classified records to develop and evaluate the algorithms. We then develop an artificial neural network model that combines all the individual algorithms. Testing on the holdout dataset, the standard deviation and histogram approaches are the most accurate individual algorithms, with an overall accuracy of about 93%. The combined artificial neural network method yields an overall accuracy of 95%, and the choice of classification threshold can balance precision and recall.
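To illustrate how one of these screens can work, the sketch below implements a histogram-style heuristic in the spirit of approach (3): clipping piles samples up at the instrument's amplitude limits, so the outermost bins of the amplitude histogram become anomalously populated. The function name and thresholds are illustrative choices, not the tuned values from the study.

```python
import numpy as np

def looks_clipped(trace, bins=100, n_edge=2, tol=3.0):
    """Histogram-based clipping screen (illustrative thresholds).

    Clipping concentrates samples at the amplitude extremes, so the
    outermost histogram bins become far more populated than a typical
    interior bin. Returns True when the edge-bin occupancy exceeds
    `tol` times the expected occupancy of 2 * n_edge interior bins.
    """
    counts, _ = np.histogram(np.asarray(trace, dtype=float), bins=bins)
    edge = counts[:n_edge].sum() + counts[-n_edge:].sum()
    interior_mean = counts[n_edge:-n_edge].mean()
    return bool(edge > tol * 2 * n_edge * (interior_mean + 1.0))

# Unclipped Gaussian noise vs. the same noise hard-clipped at +/-1 sigma
rng = np.random.default_rng(0)
noise = rng.standard_normal(10_000)
print(looks_clipped(noise))                      # False
print(looks_clipped(np.clip(noise, -1.0, 1.0)))  # True
```

A single heuristic like this has failure modes (for example, narrowband signals also concentrate amplitudes near their extremes), which is why the study combines several detectors in a neural network rather than relying on any one of them.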
Abstract In the minutes to hours after a major earthquake, such as the recent 2018 Mw 7.1 Anchorage event, the U.S. Geological Survey (USGS) produces a suite of interconnected earthquake products that provides diverse information ranging from basic earthquake source parameters to loss estimates. The 2018 Anchorage earthquake is the first major domestic earthquake to occur since several new USGS products have been developed, thus providing an opportunity to discuss the newly expanded USGS earthquake product suite, its timeliness, performance, and reception. Overall, the products were relatively timely, accurate, well received, and widely used, including by the media, who used information and visualizations from many products to frame their early reporting. One downside of the codependence of multiple products is that reasonable updates to upstream products (e.g., magnitude and source characterization) can result in significant changes to downstream products; this was the case for the Anchorage earthquake. However, the coverage of strong‐motion stations and felt reports was so dense that the ShakeMap and downstream products were relatively insensitive to changes in magnitude or fault‐plane orientation once the ground‐motion data were available. Shaking and loss indicators initially fluctuated in the first hour or two after the earthquake, but they stabilized quickly. To understand how the products are being used and how effectively they are being communicated, we analyze the media coverage of USGS earthquake products. Most references to USGS products occurred within the first 48 hr after the event. The lack of coverage after 48 hr could indicate that longer‐term products addressing what actions the USGS is taking or what early reconnaissance has revealed might be useful for those people wanting additional information about the earthquake.
Abstract We compare estimates of the empirical transfer function (ETF) to the plane SH-wave theoretical transfer function (TTF) within a laterally constant medium for invasive and noninvasive estimates of the seismic shear-wave slownesses at 13 Kiban-Kyoshin network stations throughout Japan. The difference between the ETF and either of the TTFs is substantially larger than the difference between the two TTFs computed from different estimates of the seismic properties. We show that the plane SH-wave TTF through a laterally homogeneous medium at vertical incidence inadequately models observed amplifications at most sites for both slowness estimates, obtained via downhole measurements and the spectral analysis of surface waves. Strategies to improve the predictions can be separated into two broad categories: improving the measurement of soil properties and improving the theory that maps the 1D soil profile onto spectral amplification. Using an example site where the 1D plane SH-wave formulation poorly predicts the ETF, we find a more satisfactory fit to the ETF by modeling the full wavefield and incorporating spatially correlated variability of the seismic properties. We conclude that our ability to model the observed site response transfer function is limited largely by the assumptions of the theoretical formulation rather than the uncertainty of the soil property estimates.
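For the special case of a single uniform layer over an elastic half-space, the plane SH-wave TTF at vertical incidence has a closed form, which makes the resonance structure of such 1D models easy to see. The sketch below is illustrative only (it is not the multilayer formulation a real borehole profile would require) and evaluates the surface amplification relative to a rock outcrop.

```python
import numpy as np

def sh_amplification(freq_hz, h_m, vs_soil, rho_soil, vs_rock, rho_rock):
    """Single layer over a half-space, vertically incident plane SH wave.

    Surface amplification relative to a rock outcrop:
        |A(f)| = 1 / sqrt(cos^2(kH) + r^2 * sin^2(kH)),
    with wavenumber k = 2*pi*f / vs_soil and impedance ratio
    r = (rho_soil * vs_soil) / (rho_rock * vs_rock).
    Peaks of height 1/r occur at f = (2n + 1) * vs_soil / (4 * h_m).
    """
    kh = 2.0 * np.pi * np.asarray(freq_hz, dtype=float) * h_m / vs_soil
    r = (rho_soil * vs_soil) / (rho_rock * vs_rock)
    return 1.0 / np.sqrt(np.cos(kh) ** 2 + (r * np.sin(kh)) ** 2)

# 30 m of soil (Vs = 200 m/s, rho = 1800 kg/m^3) over rock
# (Vs = 800 m/s, rho = 2400 kg/m^3): fundamental resonance at
# f0 = 200 / (4 * 30) ~ 1.67 Hz with peak amplification 1/r ~ 5.33
f0 = 200.0 / (4.0 * 30.0)
print(sh_amplification(f0, 30.0, 200.0, 1800.0, 800.0, 2400.0))
```

The abstract's conclusion can be read against this formula: even with well-measured vs_soil and rho_soil, a 1D expression of this kind cannot reproduce effects of lateral heterogeneity or scattering, so refining the inputs alone does not close the gap with the ETF.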