Statistics and Pattern Recognition Applied to the Spatio-Temporal Properties of Seismicity

A special issue of Applied Sciences (ISSN 2076-3417). This special issue belongs to the section "Earth Sciences".

Deadline for manuscript submissions: closed (31 December 2021) | Viewed by 17834

Printed Edition Available!
A printed edition of this Special Issue is available.

Special Issue Editors


Guest Editor: Dr. Stefania Gentili
National Institute of Oceanography and Applied Geophysics - OGS
Interests: statistical seismology; source parameters; seismic catalogues; artificial intelligence

Guest Editor: Dr. Rita Di Giovambattista
National Institute of Geophysics and Volcanology (INGV)
Interests: statistical seismology; earthquake physics; source parameters; seismic catalogues

Guest Editor: Dr. Robert Shcherbakov
Department of Earth Sciences and Department of Physics and Astronomy, Western University, London, ON, Canada
Interests: statistical seismology; earthquake physics; aftershocks; nonlinear geophysics; continuum mechanics; geomechanics

Guest Editor: Prof. Filippos Vallianatos
Section of Geophysics-Geothermy, Department of Geology and Geoenvironment, National and Kapodistrian University of Athens, Zografou, 15784 Athens, Greece
Interests: geophysics; earth physics; seismology; applied geophysics

Special Issue Information

Dear Colleagues,

In recent years, there has been significant progress in understanding scaling laws, spatiotemporal correlations, and clustering of earthquakes, with direct implications for time-dependent seismic hazard assessment. New models based on seismicity patterns, considering their physical meaning and their statistical significance, have shed light on the preparation process before large earthquakes and on the evolution of clustered seismicity in time and space. On the other hand, increasing amounts of seismic data available on both local and global scales, together with accurate assessments of the reliability of catalogues, offer new opportunities for model testing.

This Special Issue focuses on emerging methods to improve our understanding of the physical processes responsible for the occurrence of earthquakes in space and time, and on new models, techniques, and tools for quantifying seismotectonic processes and their evolution. It also covers new approaches and procedures to improve the analysis and processing of seismic catalogues, including machine learning techniques.

Dr. Stefania Gentili
Dr. Rita Di Giovambattista
Dr. Robert Shcherbakov
Prof. Filippos Vallianatos
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Applied Sciences is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Physical and statistical models of earthquake occurrence
  • Earthquake clustering
  • Quantitative testing
  • Earthquake catalogues
  • Time-dependent hazard
  • Earthquake forecasting
  • Model testing
  • Pattern recognition in seismology
  • Machine learning applied to seismic data
  • Uncertainty quantification methods

Published Papers (9 papers)


Editorial


2 pages, 167 KiB  
Editorial
Editorial of the Special Issue “Statistics and Pattern Recognition Applied to the Spatio-Temporal Properties of Seismicity”
by Stefania Gentili, Rita Di Giovambattista, Robert Shcherbakov and Filippos Vallianatos
Appl. Sci. 2022, 12(9), 4504; https://doi.org/10.3390/app12094504 - 29 Apr 2022
Cited by 1 | Viewed by 839
Abstract
Thanks to the significant increase in the availability of new data in recent years, resulting from the expansion of seismic station networks, laboratory experiments, and increasingly reliable synthetic catalogs, considerable progress has been made in understanding the spatiotemporal properties of earthquakes [...] Full article

Research


14 pages, 2222 KiB  
Article
Physics-Based Simulation of Sequences with Foreshocks, Aftershocks and Multiple Main Shocks in Italy
by Rodolfo Console, Paola Vannoli and Roberto Carluccio
Appl. Sci. 2022, 12(4), 2062; https://doi.org/10.3390/app12042062 - 16 Feb 2022
Cited by 4 | Viewed by 1874
Abstract
We applied a new version of a physics-based earthquake simulator to a seismogenic model of Italian seismicity derived from the latest version of the Database of Individual Seismogenic Sources (DISS). All fault systems identified in the study area were processed appropriately for use within the simulator. We obtained synthetic catalogs spanning hundreds of thousands of years. The resulting synthetic seismic catalogs exhibit magnitude, space, and time features comparable to those of real observations. A typical aspect of the observed seismicity is the occurrence of earthquake sequences characterized by multiple main shocks of similar magnitude. Special attention was devoted to verifying whether the simulated catalogs reproduce this notable aspect, using a purpose-built computer code. We found that Coulomb stress transfer from causative to receiving source patches during an earthquake rupture plays a critical role in the behavior of seismicity patterns in the simulated catalogs. We applied the simulator to the seismicity of the northern and central Apennines and compared the resulting synthetic catalog with the observed seismicity for the period 1650–2020. The result of this comparison supports the hypothesis that the occurrence of sequences containing multiple mainshocks is not just a casual circumstance. Full article
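One standard way to check that a synthetic catalog's magnitude statistics are comparable to observations is the Gutenberg-Richter b-value. The sketch below (an illustrative textbook calculation, not the authors' code) estimates b with Aki's maximum-likelihood formula on a synthetic catalog; the completeness threshold, sample size, and true b-value are arbitrary choices for the example.

```python
import math
import random

def b_value_aki(mags, m_c):
    """Aki (1965) maximum-likelihood b-value for continuous magnitudes above
    the completeness threshold m_c. (Binned catalogs would additionally need
    Utsu's half-bin correction, omitted here.)"""
    above = [m for m in mags if m >= m_c]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - m_c)

# Illustrative synthetic catalog: magnitudes drawn from a Gutenberg-Richter
# distribution with b = 1 above completeness m_c = 3.0.
random.seed(0)
b_true, m_c = 1.0, 3.0
mags = [m_c - math.log10(random.random()) / b_true for _ in range(5000)]
b_hat = b_value_aki(mags, m_c)  # should be close to b_true
```

Comparing b_hat (and the space-time clustering statistics discussed in the abstract) between synthetic and observed catalogs is the kind of consistency check such simulators are validated against.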

31 pages, 32674 KiB  
Article
Identification and Temporal Characteristics of Earthquake Clusters in Selected Areas in Greece
by Polyzois Bountzis, Eleftheria Papadimitriou and George Tsaklidis
Appl. Sci. 2022, 12(4), 1908; https://doi.org/10.3390/app12041908 - 11 Feb 2022
Cited by 7 | Viewed by 1970
Abstract
The efficiency of earthquake clustering investigation improves as we gain access to larger datasets due to the increase in earthquake detectability. We aim to demonstrate the robustness of a new clustering method, MAP-DBSCAN, and to present a comprehensive analysis of the clustering properties in three major seismic zones of Greece during 2012–2019. A time-dependent stochastic point model, the Markovian Arrival Process (MAP), is implemented for the detection of change-points in the seismicity rate; subsequently, a density-based clustering algorithm, DBSCAN, is used for grouping the events into spatiotemporal clusters. The two-step clustering procedure, MAP-DBSCAN, is compared with other existing methods (Gardner-Knopoff, Reasenberg, Nearest-Neighbor) on a simulated earthquake catalog and proves highly competitive, outperforming the tested algorithms in most cases. Next, the earthquake clusters in the three areas are detected and the regional variability of their productivity rates is investigated based on the generic estimates of the Epidemic Type Aftershock Sequence (ETAS) model. The seismicity in the seismic zone of the Corinth Gulf is characterized by low aftershock productivity and high background rates, indicating the dominance of swarm activity, whereas in the Central Ionian Islands seismic zone, where mainshock-aftershock sequences dominate, the aftershock productivity rates are higher. The productivity in the seismic zone of the North Aegean Sea varies significantly among clusters, probably due to the co-existence of swarm activity and aftershock sequences. We believe that incorporating regional variations of productivity into forecasting models, such as the ETAS model, might improve operational earthquake forecasting. Full article

34 pages, 4809 KiB  
Article
The Analysis of the Aftershock Sequences of the Recent Mainshocks in Alaska
by Mohammadamin Sedghizadeh and Robert Shcherbakov
Appl. Sci. 2022, 12(4), 1809; https://doi.org/10.3390/app12041809 - 10 Feb 2022
Cited by 2 | Viewed by 1805
Abstract
The forecasting of the evolution of natural hazards is an important and critical problem in natural sciences and engineering. Earthquake forecasting is one such example and is a difficult task due to the complexity of the occurrence of earthquakes. Since earthquake forecasting is typically based on the seismic history of a given region, the analysis of past seismicity plays a critical role in modern statistical seismology. In this respect, the three recent significant mainshocks that occurred in Alaska (the 2002, Mw 7.9 Denali; the 2018, Mw 7.9 Kodiak; and the 2018, Mw 7.1 Anchorage earthquakes) presented an opportunity to analyze these sequences in detail. This included the modelling of the frequency-magnitude statistics of the corresponding aftershock sequences. In addition, the aftershock occurrence rates were modelled using the Omori–Utsu (OU) law and the Epidemic Type Aftershock Sequence (ETAS) model. For each sequence, the probability of having the largest expected aftershock during a given forecasting time interval was computed using both extreme value theory and the Bayesian predictive framework. For the Bayesian approach, Markov Chain Monte Carlo (MCMC) sampling of the posterior distribution was performed to generate chains of the model parameters. These MCMC chains were used to simulate the models forward in time to compute the predictive distributions. Computing the probability that the largest expected aftershock after a mainshock exceeds a given magnitude within the Bayesian predictive framework fully accounts for the uncertainties of the model parameters. Moreover, in order to investigate the credibility of the obtained forecasts, several statistical tests were conducted to compare the performance of the earthquake rate models based on the OU formula and the ETAS model. The results indicate that the Bayesian approach combined with the ETAS model produced more robust results than the standard approach based on the extreme value distribution and the OU law. Full article
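The Omori–Utsu law has a closed-form integral, which is what makes rate fitting and forward simulation of aftershock counts tractable. A minimal sketch, with illustrative parameter values rather than the fitted ones from the paper:

```python
import math

def omori_utsu_rate(t, K, c, p):
    """Omori-Utsu aftershock rate n(t) = K / (c + t)**p at time t after the mainshock."""
    return K / (c + t) ** p

def expected_count(t1, t2, K, c, p):
    """Expected number of aftershocks in [t1, t2]: the closed-form integral of n(t)."""
    if p == 1.0:
        return K * (math.log(c + t2) - math.log(c + t1))
    return K * ((c + t2) ** (1.0 - p) - (c + t1) ** (1.0 - p)) / (1.0 - p)

# Illustrative values (not the fitted parameters from the paper); times in days
K, c, p = 100.0, 0.1, 1.1
first_day = expected_count(0.0, 1.0, K, c, p)
first_week = expected_count(0.0, 7.0, K, c, p)
```

In the Bayesian treatment described above, K, c, and p would be MCMC samples from a posterior, and such forward counts would be computed per sample to build the predictive distribution; that machinery is beyond this sketch.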

13 pages, 2989 KiB  
Article
Estimation of the Tapered Gutenberg-Richter Distribution Parameters for Catalogs with Variable Completeness: An Application to the Atlantic Ridge Seismicity
by Matteo Taroni, Jacopo Selva and Jiancang Zhuang
Appl. Sci. 2021, 11(24), 12166; https://doi.org/10.3390/app112412166 - 20 Dec 2021
Cited by 4 | Viewed by 2777
Abstract
The use of the tapered Gutenberg-Richter distribution in earthquake source models is rapidly increasing, as it avoids the definition of a hard threshold for the maximum magnitude. Here, we extend the classical maximum likelihood estimation method for the parameters of the tapered Gutenberg-Richter distribution to allow a magnitude of completeness that varies through time. Adopting a well-established technique based on asymptotic theory, we also estimate the uncertainties of the parameters. Differently from other estimation methods for catalogs with variable completeness, available for example for the classical truncated Gutenberg-Richter distribution, our approach does not require an assumption about the distribution of the number of events (usually the Poisson distribution). We test the methodology by checking the consistency of parameter estimates on synthetic catalogs generated with multiple completeness levels. Then, we analyze the Atlantic ridge seismicity, using the global centroid moment tensor catalog, finding that our method better constrains the distribution parameters because it allows the use of more data than estimations based on a single completeness level. This leads to a sharp decrease in the uncertainties associated with the parameter estimation, when compared with existing methods based on a single time-independent magnitude of completeness. It also allows analyzing subsets of events to deepen the data analysis. For example, separating normal and strike-slip events, we found that they have significantly different but well-constrained corner magnitudes. Instead, without distinguishing by focal mechanism and considering all the events in the catalog, we obtain an intermediate value that is relatively less constrained by the data, with an open confidence region. Full article
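For readers unfamiliar with the tapered Gutenberg-Richter distribution, a minimal sketch of its survival function and the per-event log-density that enters a maximum likelihood fit may help. This assumes Kagan's standard parameterization in seismic moment with a single completeness level; the paper's variable-completeness extension is not reproduced here.

```python
import math

def tapered_gr_survival(m, m_t, beta, m_corner):
    """Tapered Gutenberg-Richter survival function for seismic moment m >= m_t:
    a Pareto tail of index beta with an exponential taper at the corner
    moment m_corner (so no hard maximum-magnitude cutoff is needed)."""
    return (m_t / m) ** beta * math.exp((m_t - m) / m_corner)

def tapered_gr_log_pdf(m, m_t, beta, m_corner):
    """Log density, obtained as -d/dm of the survival function; summing this
    over observed moments gives the log-likelihood to be maximized."""
    return (math.log(beta / m + 1.0 / m_corner)
            + beta * math.log(m_t / m) + (m_t - m) / m_corner)
```

Maximizing the summed log-density over beta and m_corner recovers the classical estimator; the open confidence region for the corner magnitude mentioned in the abstract arises because the likelihood can flatten as m_corner grows.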

12 pages, 2433 KiB  
Article
A New Smoothed Seismicity Approach to Include Aftershocks and Foreshocks in Spatial Earthquake Forecasting: Application to the Global Mw ≥ 5.5 Seismicity
by Matteo Taroni and Aybige Akinci
Appl. Sci. 2021, 11(22), 10899; https://doi.org/10.3390/app112210899 - 18 Nov 2021
Cited by 7 | Viewed by 1496
Abstract
Seismicity-based earthquake forecasting models have been developed primarily over the past twenty years. These models mainly rely on seismicity catalogs as their data source and provide quantifiable forecasts in time, space, and magnitude. In this study, we present a technique to better forecast the locations of future earthquakes based on spatially smoothed seismicity. The improvement's main objective is to use foreshock and aftershock events together with their mainshocks. Time-independent earthquake forecast models are often developed using declustered catalogs, in which the smaller-magnitude events associated with each mainshock are removed. Declustered catalogs are required in probabilistic seismic hazard analysis (PSHA) to satisfy the Poisson assumption that events are independent in time and space. However, as highlighted by many recent studies, removing such events from seismic catalogs may lead to underestimating seismicity rates and, consequently, the final seismic hazard in terms of ground shaking. Our study demonstrates that considering the complete catalog may improve the spatial forecasting of future earthquakes. To do so, we adopted two different smoothed seismicity methods: (1) the fixed smoothing method, which uses spatially uniform smoothing parameters, and (2) the adaptive smoothing method, which assigns an individual smoothing distance to each earthquake. The smoothed seismicity models are constructed using the global earthquake catalog with Mw ≥ 5.5 events. We compared the smoothed seismicity models by calculating and evaluating their joint log-likelihoods. The resulting forecast shows a significant information gain for both the fixed and adaptive smoothing model forecasts. Our findings indicate that using complete catalogs notably increases the spatial skill of seismicity forecasts. Full article
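The fixed smoothing method can be sketched as a 2-D Gaussian kernel density sum over epicenters. This is a generic textbook kernel, not necessarily the exact kernel used by the authors; in the adaptive variant, sigma would instead be set per event, for example from the distance to the k-th nearest neighboring epicenter.

```python
import math

def smoothed_rate(x, y, epicenters, sigma):
    """Fixed-bandwidth 2-D Gaussian kernel estimate of the relative seismicity
    rate density at (x, y), summed over (xi, yi) epicenters. Each kernel
    integrates to one over the plane, so the total integral equals the
    number of events."""
    norm = 1.0 / (2.0 * math.pi * sigma ** 2)
    return sum(norm * math.exp(-((x - xi) ** 2 + (y - yi) ** 2)
                               / (2.0 * sigma ** 2))
               for xi, yi in epicenters)
```

Evaluating this density on a grid and normalizing gives the spatial forecast whose joint log-likelihood against later events is the comparison metric mentioned in the abstract.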

13 pages, 4457 KiB  
Article
Space–Time Trade-Off of Precursory Seismicity in New Zealand and California Revealed by a Medium-Term Earthquake Forecasting Model
by Sepideh J. Rastin, David A. Rhoades and Annemarie Christophersen
Appl. Sci. 2021, 11(21), 10215; https://doi.org/10.3390/app112110215 - 31 Oct 2021
Cited by 5 | Viewed by 1515
Abstract
The ‘Every Earthquake a Precursor According to Scale’ (EEPAS) medium-term earthquake forecasting model is based on the precursory scale increase (Ψ) phenomenon and associated scaling relations, in which the precursor magnitude MP is predictive of the mainshock magnitude Mm, precursor time TP and precursory area AP. In early studies of Ψ, a relatively low correlation between TP and AP suggested the possibility of a trade-off between time and area as a second-order effect. Here, we investigate the trade-off by means of the EEPAS model. Existing versions of EEPAS in New Zealand and California forecast target earthquakes of magnitudes M > 4.95 from input catalogues with M > 2.95. We systematically vary one parameter each from the EEPAS distributions for time and location, thereby varying the temporal and spatial scales of these distributions by two orders of magnitude. As one of these parameters is varied, the other is refitted to a 20-year period of each catalogue. The resulting curves of the temporal scaling factor against the spatial scaling factor are consistent with an even trade-off between time and area, given the limited temporal and spatial extent of the input catalogue. Hybrid models are formed by mixing several EEPAS models, with parameter sets chosen from points on the trade-off line. These are tested against the original fitted EEPAS models on a subsequent period of the New Zealand catalogue. The resulting information gains suggest that the space–time trade-off can be exploited to improve forecasting. Full article

17 pages, 13767 KiB  
Article
Different Fault Response to Stress during the Seismic Cycle
by Davide Zaccagnino, Luciano Telesca and Carlo Doglioni
Appl. Sci. 2021, 11(20), 9596; https://doi.org/10.3390/app11209596 - 14 Oct 2021
Cited by 6 | Viewed by 1553
Abstract
Seismic prediction has long been considered impossible; however, no principle of theoretical physics explicitly rules it out. It is therefore quite likely that prediction is made stubbornly complicated by practical difficulties such as the quality of catalogs and data analysis. Earthquakes are sometimes forewarned by precursors, and other times they come unexpectedly; moreover, since no unique nucleation mechanism has been proven to exist, it is unlikely that any single classical precursor (e.g., increasing seismicity, geochemical anomalies, geoelectric potentials) will ever be effective in predicting impending earthquakes. For this reason, understanding the physics driving the evolution of fault systems is a crucial task for fine-tuning seismic prediction methods and for the mitigation of seismic risk. In this work, an innovative idea for establishing the proximity to the critical breaking point is investigated. It is based on the mechanical response of faults to tidal perturbations, which is observed to change during the “seismic cycle”. This technique allows the identification of different seismic patterns marking the fingerprints of progressive crustal weakening. Destabilization seems to arise from two different possible mechanisms, compatible with the so-called preslip-patch and cascade models and with seismic quiescence. The first is characterized by decreasing susceptibility to stress perturbation, anomalous geodetic deformation, and seismic activity, while the second shows seismic quiescence and increasing responsiveness. The novelty of this article consists in highlighting not only the variations in the responsiveness of faults to stress as they approach the critical point, but also how seismic occurrence changes over time as a function of instability. Temporal swings in the correlation between tides and nucleated seismic energy reveal a complex mechanism of modulation of energy dissipation driven by stress variations, above all in the upper brittle crust. Some case studies taken from recent Greek seismicity are investigated. Full article
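A classic statistic for testing whether events preferentially occur at certain tidal phases is the Schuster test. The authors' susceptibility analysis is more elaborate, but this sketch conveys the basic idea of measuring tidal modulation of seismicity:

```python
import math

def schuster_p_value(phases):
    """Schuster test: p-value for the null hypothesis that event phases
    (radians, relative to the tidal cycle) are uniformly distributed.
    A small p suggests tidal modulation of earthquake occurrence."""
    n = len(phases)
    a = sum(math.cos(t) for t in phases)
    b = sum(math.sin(t) for t in phases)
    # Vector sum of unit phasors; under uniform phases, D^2 = a^2 + b^2
    # is exponentially distributed with mean n.
    return math.exp(-(a * a + b * b) / n)
```

Tracking how such a modulation statistic drifts through the seismic cycle is, in spirit, what the abstract describes as a changing fault response to tidal stress perturbations.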

30 pages, 10727 KiB  
Article
System-Analytical Method of Earthquake-Prone Areas Recognition
by Boris A. Dzeboev, Alexei D. Gvishiani, Sergey M. Agayan, Ivan O. Belov, Jon K. Karapetyan, Boris V. Dzeranov and Yuliya V. Barykina
Appl. Sci. 2021, 11(17), 7972; https://doi.org/10.3390/app11177972 - 28 Aug 2021
Cited by 10 | Viewed by 2365
Abstract
Typically, strong earthquakes do not occur over the entire territory of a seismically active region. Recognition of the areas where they may occur is a critical step in seismic hazard assessment studies. For half a century, the Earthquake-Prone Areas (EPA) approach, developed by the renowned Soviet academicians I.M. Gelfand and V.I. Keilis-Borok, has been used to recognize areas prone to strong earthquakes. To modernize the ideas that form the basis of the EPA method, new mathematical methods of pattern recognition are proposed. They were developed by the authors to overcome the difficulties that arise today when using the EPA approach in its classic version. Firstly, a scheme for the recognition of highly seismic disjunctive nodes and the vicinities of intersections of morphostructural lineament axes was created with only one high-seismicity learning class. Secondly, the system-analytical method FCAZ (Formalized Clustering and Zoning) has been developed. It uses the epicenters of fairly weak earthquakes as recognition objects. This makes it possible to update the recognized areas prone to strong earthquakes as epicenters of new weak earthquakes appear and, thereby, to repeatedly correct the results over time. The FCAZ method makes it possible, for the first time, to consider the classical problem of earthquake-prone area recognition from the point of view of advanced systems analysis. The new mathematical recognition methods proposed in the article have successfully identified earthquake-prone areas on the continents of North and South America, Eurasia, and in the subduction zones of the Pacific Rim. Full article
