The Geology and Landscapes of Scotland
In: Scottish Affairs, Volume 26, Issue 2, pp. 266-267
ISSN: 2053-888X
In: Scottish Affairs, Volume 65 (First Series), Issue 1, pp. 147-150
ISSN: 2053-888X
In: Natural Hazards and Earth System Sciences (NHESS), Volume 22, Issue 10, pp. 3231-3246
ISSN: 1684-9981
Abstract. Probabilistic earthquake forecasts estimate the likelihood of future earthquakes within a specified time-space-magnitude window and are important because they inform the planning of hazard mitigation activities on different time scales. The spatial component of such forecasts, expressed as seismicity models, generally relies upon some combination of past event locations and underlying factors which might affect spatial intensity, such as strain rate, fault location and slip rate, or past seismicity. For the first time, we extend previously reported spatial seismicity models, generated using the open-source inlabru package, to time-independent earthquake forecasts, using California as a case study. The inlabru approach allows the rapid evaluation of point process models which integrate different spatial datasets. We explore how well various candidate forecasts perform compared to observed activity over three contiguous 5-year time periods, using the same training window for the input seismicity data. In each case we compare models constructed from both full and declustered earthquake catalogues. In doing this, we compare the use of synthetic catalogue forecasts to the more widely used grid-based approach of previous forecast testing experiments. The simulated catalogue approach uses the full model posteriors, not just the mean, to create Bayesian earthquake forecasts. We show that simulated-catalogue-based forecasts perform better than the grid-based equivalents due to (a) their ability to capture more uncertainty in the model components and (b) the associated relaxation of the Poisson assumption in testing. We demonstrate that the inlabru models perform well overall over various time periods: the full-catalogue models perform favourably in the first testing period (2006–2011), while the declustered-catalogue models perform better in the 2011–2016 testing period, with both sets of models performing less well in the most recent (2016–2021) testing period. Together, these findings demonstrate that a significant improvement in earthquake forecasting is possible, although this has yet to be tested and proven in true prospective mode.
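The contrast between grid-based and simulated-catalogue forecasts described in this abstract can be sketched numerically. The following Python fragment is an illustration only: the "posterior samples" are synthetic stand-ins rather than output of the paper's inlabru models, and the grid size and parameter values are arbitrary assumptions.

import numpy as np

rng = np.random.default_rng(42)

# Hypothetical posterior samples of a spatial rate model on a small grid: each
# row is one plausible map of expected event counts per cell for the forecast
# window.  In a real workflow these would come from the fitted model's
# posterior, not from random noise as here.
n_samples, n_cells = 1000, 50
posterior_rate = np.exp(rng.normal(loc=-1.0, scale=0.8, size=(n_samples, n_cells)))

# Grid-based forecast: a single mean-rate map, scored with a per-cell Poisson
# likelihood.
mean_rate = posterior_rate.mean(axis=0)

# Simulated-catalogue forecast: one synthetic catalogue per posterior sample,
# so counts are mixed over the posterior and more model uncertainty is kept;
# the totals become overdispersed relative to a single Poisson rate, which is
# the relaxation of the Poisson assumption mentioned above.
synthetic_counts = rng.poisson(posterior_rate)
totals = synthetic_counts.sum(axis=1)

print("mean of simulated totals    :", totals.mean())
print("variance of simulated totals:", totals.var())     # exceeds the mean
print("Poisson variance (grid)     :", mean_rate.sum())  # equals the mean rate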
Recent developments in earthquake forecasting models have demonstrated the need for a robust method for identifying which model components are most beneficial to understanding spatial patterns of seismicity. Borrowing from ecology, we use log-Gaussian Cox process models to describe the spatially varying intensity of earthquake locations. These models are constructed from elements which may influence earthquake locations, including the underlying fault map and past seismicity models, together with a random field to account for any excess spatial variation that cannot be explained by the deterministic model components. Comparing the alternative models allows the assessment of the performance of models of varying complexity composed of different components, and therefore identifies which elements are most useful for describing the distribution of earthquake locations. We demonstrate the effectiveness of this approach using synthetic data and by making use of the earthquake and fault information available for California, including an application to the 2019 Ridgecrest sequence. We show the flexibility of this modeling approach and how it might be applied in areas where we do not have the same abundance of detailed information. We find results consistent with the existing literature on the performance of past seismicity models: slip rates are beneficial for describing the spatial locations of larger-magnitude events, and strain rate maps can constrain the spatial limits of seismicity in California. We also demonstrate that maps of distance to the nearest fault can benefit spatial models of seismicity, even those that also include the primary fault geometry used to construct them.
Funding: K. B. was funded during this work by an EPSRC PhD studentship (Grant 1519006) and during the writing of this paper by NERC-NSF grant NE/R000794/1 and by the Real-time Earthquake Risk Reduction for a Resilient Europe "RISE" project, which has received funding from the European Union's Horizon 2020 research and innovation program under Grant Agreement ...
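As a brief sketch of the model class just described (notation assumed here for illustration, not quoted from the abstract), a log-Gaussian Cox process represents the spatially varying intensity of earthquake locations as
\[
\log \lambda(s) \;=\; \beta_0 + \sum_{k=1}^{K} \beta_k\, x_k(s) + \xi(s),
\]
where the covariates $x_k(s)$ stand for deterministic components such as fault maps, slip or strain rates, or smoothed past seismicity, and $\xi(s)$ is a zero-mean Gaussian random field absorbing spatial variation that the covariates cannot explain. Conditional on $\lambda$, observed locations $\{s_1,\dots,s_N\}$ in a study region $\Omega$ follow an inhomogeneous Poisson process with likelihood
\[
L(\lambda) \;=\; \exp\!\Big(-\int_{\Omega} \lambda(s)\,\mathrm{d}s\Big)\,\prod_{i=1}^{N}\lambda(s_i),
\]
so candidate models built from different subsets of covariates, with or without the random field, can be compared on predictive performance to identify which components best describe the observed locations.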
Following the 2009 L'Aquila earthquake, the Dipartimento della Protezione Civile Italiana (DPC) appointed an International Commission on Earthquake Forecasting for Civil Protection (ICEF) to report on the current state of knowledge of short-term prediction and forecasting of tectonic earthquakes and to indicate guidelines for the utilization of possible forerunners of large earthquakes to drive civil protection actions, including the use of probabilistic seismic hazard analysis in the wake of a large earthquake. The ICEF reviewed research on earthquake prediction and forecasting, drawing from developments in seismically active regions worldwide. A prediction is defined as a deterministic statement that a future earthquake will or will not occur in a particular geographic region, time window, and magnitude range, whereas a forecast gives a probability (greater than zero but less than one) that such an event will occur. Earthquake predictability, the degree to which the future occurrence of earthquakes can be determined from the observable behavior of earthquake systems, is poorly understood. This lack of understanding is reflected in the inability to reliably predict large earthquakes in seismically active regions on short time scales. Most proposed prediction methods rely on the concept of a diagnostic precursor, i.e., some kind of signal observable before earthquakes that indicates with high probability the location, time, and magnitude of an impending event. Precursor methods reviewed here include changes in strain rates, seismic wave speeds, and electrical conductivity; variations of radon concentrations in groundwater, soil, and air; fluctuations in groundwater levels; electromagnetic variations near and above Earth's surface; thermal anomalies; anomalous animal behavior; and seismicity patterns. The search for diagnostic precursors has not yet produced a successful short-term prediction scheme. Therefore, this report focuses on operational earthquake forecasting as the principal means for gathering and disseminating authoritative information about time-dependent seismic hazards to help communities prepare for potentially destructive earthquakes. On short time scales of days and weeks, earthquake sequences show clustering in space and time, as indicated by the aftershocks triggered by large events. Statistical descriptions of clustering explain many features observed in seismicity catalogs, and they can be used to construct forecasts that indicate how earthquake probabilities change over the short term. Properly applied, short-term forecasts have operational utility, for example in anticipating aftershocks that follow large earthquakes. Although the value of long-term forecasts for ensuring seismic safety is clear, the interpretation of short-term forecasts is problematic, because earthquake probabilities may vary over orders of magnitude but typically remain low in an absolute sense (< 1% per day). Translating such low-probability forecasts into effective decision-making is a difficult challenge. Reports on the current utilization of operational forecasting in earthquake risk management were compiled for six countries with high seismic risk: China, Greece, Italy, Japan, Russia, and the United States.
Long-term models are currently the most important forecasting tools for civil protection against earthquake damage, because they guide earthquake safety provisions of building codes, performance-based seismic design, and other risk-reducing engineering practices, such as retrofitting to correct design flaws in older buildings. Short-term forecasting of aftershocks is practiced by several countries among those surveyed, but operational earthquake forecasting has not been fully implemented (i.e., regularly updated and on a national scale) in any of them. Based on the experience accumulated in seismically active regions, the ICEF has provided to DPC a set of recommendations on the utilization of operational forecasting in Italy, which may also be useful in other countries. The public should be provided with open sources of information about the short-term probabilities of future earthquakes that are authoritative, scientific, consistent, and timely. Advisories should be based on operationally qualified, regularly updated seismicity forecasting systems that have been rigorously reviewed and updated by experts in the creation, delivery, and utility of earthquake information. The quality of all operational models should be evaluated for reliability and skill by retrospective testing, and they should be under continuous prospective testing against established long-term forecasts and alternative time-dependent models. Alert procedures should be standardized to facilitate decisions at different levels of government and among the public. Earthquake probability thresholds should be established to guide alert levels based on objective analysis of costs and benefits, as well as the less tangible aspects of value-of-information, such as gains in psychological preparedness and resilience. The principles of effective public communication established by social science research should be applied to the delivery of seismic hazard information.
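To make concrete the kind of short-term clustering forecast the report discusses, the following sketch computes a daily aftershock probability from a Reasenberg-Jones-style (modified Omori) rate model. The parameter values are purely illustrative assumptions, not figures from the ICEF report; in practice such parameters are fitted to regional catalogues.

import math

# Illustrative parameters for a Reasenberg-Jones-style aftershock-rate model
# (made-up values for demonstration only): the rate of aftershocks with
# magnitude >= m at time t (days) after a mainshock of magnitude M is
#     lam(t) = 10**(a + b*(M - m)) / (t + c)**p
a, b, c, p = -1.67, 0.91, 0.05, 1.08
M_mainshock, m_target = 7.0, 7.0   # probability of an event as large as the mainshock

def expected_count(t1, t2):
    """Expected number of events >= m_target between t1 and t2 days after the
    mainshock, via the closed-form integral of the modified Omori law (p != 1)."""
    k = 10 ** (a + b * (M_mainshock - m_target))
    return k * ((t2 + c) ** (1 - p) - (t1 + c) ** (1 - p)) / (1 - p)

# Daily probability one week after the mainshock, assuming Poisson occurrence
# given the rate:
n = expected_count(7.0, 8.0)
print(f"expected count: {n:.5f}  probability: {1 - math.exp(-n):.3%}")
# A few tenths of a percent per day: strongly elevated relative to background,
# yet still low in an absolute sense, which is the communication challenge the
# report highlights.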