Please use this identifier to cite or link to this item: http://repositorio.udec.cl/jspui/handle/11594/10096
Title: Implementation of a numerical methodology for the stochastic characterization of the Valdivia 1960 9.5 Mw tsunami source.
Author: Cifuentes Lobos, Rodrigo Ignacio
Thesis supervisor: Calisto Burgos, María Ignacia
Keywords: Earthquakes; Chile; 1960; Mathematical Models; Earthquake Risk Analysis; Tsunamis; Subduction Zones
Publication date: 2022
Publisher: Universidad de Concepción.
Abstract: Probabilistic Tsunami Hazard Assessment (PTHA) provides a variety of mathematical and numerical tools for evaluating the long-term exposure of coastal communities to tsunami-related hazards. Among these, the logic tree method stands out for its usefulness and versatility in generating random slip models and in dealing with epistemic and aleatory uncertainties, key elements for the stochastic study of future tsunami scenarios. By combining the parameters that define a source model (such as magnitude and rupture limits), this method allows for the creation of a vast number of random source models that can be used to assess future, long-term hazard. Combined with data and observations from past tsunamis and earthquakes, it also opens new possibilities for studying past tsunamis and their seismic sources. This study proposes a numerical methodology, based on the logic tree method, for generating random tsunami source models for the study of historical tsunamis; here it is tested with compiled data from the great Valdivia 1960 9.5 Mw earthquake and tsunami.

The methodology filters the random source models produced by the logic tree in a staggered fashion. First, models are filtered with empirical relations between magnitude and rupture dimensions or rupture aspect ratios. The remaining models are then used to compute vertical seafloor deformation with the Okada (1985) solution, and the resulting deformation fields are compared with geodetic data and observations associated with the event of interest, in this case the Valdivia 1960 earthquake, eliminating every model that does not satisfy those observations. The models that pass this filter are used as inputs for tsunami modeling in a staggered scheme: first with low-resolution topobathymetry grids, to check that tsunami waves register at locations known to have been inundated, discarding the models that do not show this behavior; then, for the models that pass the low-resolution stage, with high-resolution grids, to estimate run-up and compare it with reliable historical accounts and sedimentological observations. The models that pass all of the above filters are subjected to statistical analyses, such as the conditional probability of slip given its location or the analysis of the cumulative distribution functions of slip, and compared with existing published models of the Valdivia 1960 earthquake. The Valdivia 1960 9.5 Mw event is used as a benchmark to appraise the convergence of the surviving random models towards the existing source models, owing to the number of published studies, the data available, the reliable historical accounts, and the source models computed with different techniques and from different data sets (geodetic, seismic, and tsunami recordings). It must be stressed that this methodology was designed, and is intended, for the study of historical tsunamis; it is tested here with a modern tsunami, the Valdivia 1960 event, only because of the availability of data and studies.
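As an illustration of the first stage described above, a minimal Python sketch of the logic-tree sampling and the magnitude-dimension scaling filter might look as follows. The branch values, the scaling coefficients, and the tolerance `tol` are placeholder assumptions, not the parameters actually used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Logic-tree branches (placeholder values, not the thesis's actual tree);
# each random source model combines one draw per parameter.
MAGNITUDES = [9.3, 9.4, 9.5, 9.6]            # candidate moment magnitudes
LENGTHS_KM = np.arange(800.0, 1301.0, 50.0)  # candidate rupture lengths
WIDTHS_KM = np.arange(100.0, 251.0, 25.0)    # candidate rupture widths
NORTH_LIMITS = np.arange(-38.0, -36.9, 0.5)  # northern rupture limit (deg lat)

def sample_source():
    """Draw one random source model from the logic-tree branches."""
    return {
        "Mw": float(rng.choice(MAGNITUDES)),
        "length_km": float(rng.choice(LENGTHS_KM)),
        "width_km": float(rng.choice(WIDTHS_KM)),
        "north_lat": float(rng.choice(NORTH_LIMITS)),
    }

def passes_scaling_filter(src, tol=0.3):
    """First filter: keep models whose rupture dimensions agree, within a
    log10 tolerance, with an empirical magnitude-dimension scaling law.
    Coefficients below are placeholders for a published interface relation."""
    log_len = -2.5 + 0.59 * src["Mw"]   # log10(length [km]) ~ a + b * Mw
    log_wid = -0.9 + 0.35 * src["Mw"]   # log10(width  [km]) ~ a + b * Mw
    return (abs(np.log10(src["length_km"]) - log_len) < tol
            and abs(np.log10(src["width_km"]) - log_wid) < tol)

stage1 = [s for s in (sample_source() for _ in range(100_000))
          if passes_scaling_filter(s)]
```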
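The deformation filter could then be sketched as below, continuing from the previous block. The `okada_vertical` routine is a hypothetical placeholder for a tested Okada (1985) implementation, and the sign check and normalized-RMS threshold are assumed acceptance criteria, not the thesis's actual ones.

```python
import numpy as np

def okada_vertical(src, lons, lats):
    """Hypothetical placeholder: evaluate the Okada (1985) half-space
    solution for source `src` and return vertical displacement (m) at the
    given coordinates. In practice this wraps a tested implementation."""
    raise NotImplementedError

def passes_deformation_filter(src, lons, lats, obs_uz, obs_sigma,
                              max_nrms=2.0):
    """Second filter: reject models whose predicted vertical seafloor
    deformation disagrees with the geodetic observations for the event."""
    pred_uz = okada_vertical(src, lons, lats)
    if np.any(np.sign(pred_uz) != np.sign(obs_uz)):
        return False                      # uplift/subsidence pattern mismatch
    nrms = np.sqrt(np.mean(((pred_uz - obs_uz) / obs_sigma) ** 2))
    return nrms < max_nrms

# stage2 = [s for s in stage1
#           if passes_deformation_filter(s, lons, lats, obs_uz, obs_sigma)]
```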
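The staggered tsunami-modeling filters admit a similar sketch. Here `run_tsunami` is a hypothetical stand-in for an actual shallow-water tsunami solver, and the amplitude threshold and run-up tolerance are illustrative assumptions.

```python
def run_tsunami(src, resolution):
    """Hypothetical stand-in for a tsunami simulation driven by the
    deformation of `src`; returns peak wave amplitude (m) at a fixed set
    of coastal gauges. `resolution` selects the topobathymetry grid."""
    raise NotImplementedError

KNOWN_INUNDATED = []   # gauge indices at sites known to have been inundated
RUNUP_OBS = {}         # gauge index -> run-up estimate (m) from accounts

def passes_lowres_filter(src, min_amp=0.5):
    """Third filter: a coarse-grid run must register tsunami waves at every
    location known to have been inundated (amplitude threshold assumed)."""
    amps = run_tsunami(src, resolution="low")
    return all(amps[i] >= min_amp for i in KNOWN_INUNDATED)

def passes_highres_filter(src, tol=0.5):
    """Fourth filter: high-resolution run-up must match historical and
    sedimentological estimates within an assumed tolerance (m)."""
    amps = run_tsunami(src, resolution="high")
    return all(abs(amps[i] - h) <= tol for i, h in RUNUP_OBS.items())

# final = [s for s in stage2
#          if passes_lowres_filter(s) and passes_highres_filter(s)]
```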
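Finally, the statistical comparison stage could be sketched as follows; the `(n_models, n_subfaults)` slip layout is an assumption made for illustration.

```python
import numpy as np

def slip_ecdf(slip_values):
    """Empirical cumulative distribution function of slip, pooled over all
    accepted models, for comparison with published Valdivia 1960 sources."""
    x = np.sort(np.asarray(slip_values).ravel())
    return x, np.arange(1, x.size + 1) / x.size

def prob_slip_exceeds(models_slip, threshold):
    """Conditional probability that slip exceeds `threshold` at each
    subfault location: the fraction of accepted models that exceed it.
    `models_slip` is an (n_models, n_subfaults) array (assumed layout)."""
    return (np.asarray(models_slip) > threshold).mean(axis=0)
```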
The hypothesis proposed in this study is that the most likely tsunami source of the Valdivia 1960 earthquake (its slip distribution, rupture geometry, and rupture limits), estimated through the analysis of random displacement models obtained with a logic tree structure, will be a solution that satisfies the available geodetic, deformation, and tsunami data, observations, and historical accounts. The work is divided into two parts. The first is a resolution test with synthetic deformation, wave-arrival, and inundation data, designed to probe the capabilities of the method and its response to different types of data availability: differences in data density (a large number of data points versus sparse points), in spatial distribution (uniform coverage along the territory versus one or more clusters of data), and in the relative availability of deformation versus tsunami data. Once this test is performed, the methodology is applied to the Valdivia earthquake itself.
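A minimal sketch of how the synthetic observation layouts for the resolution test might be generated; the latitude range, point counts, and cluster width are illustrative assumptions, not the thesis's actual setup.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def synthetic_stations(kind, n, lat_range=(-46.0, -37.0)):
    """Generate synthetic observation latitudes along the rupture for the
    resolution test: even coverage versus one localized cluster of data."""
    lo, hi = lat_range
    if kind == "uniform":
        return np.linspace(lo, hi, n)      # even coverage along the coast
    if kind == "cluster":
        center = rng.uniform(lo, hi)       # one localized patch of data
        return np.clip(rng.normal(center, 0.3, size=n), lo, hi)
    raise ValueError(f"unknown layout: {kind}")

scenarios = {
    "dense_uniform": synthetic_stations("uniform", 100),   # many points
    "sparse_uniform": synthetic_stations("uniform", 10),   # few points
    "single_cluster": synthetic_stations("cluster", 30),   # clustered data
}
```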
Description: Thesis for the degree of Master of Science in Geophysics.
URI: http://repositorio.udec.cl/jspui/handle/11594/10096
Appears in collections: Ciencias Físicas y Matemáticas - Tesis Magister

Files in this item:
File: Tesis Rodrigo Cifuentes.Image.Marked.pdf (14.65 MB, Adobe PDF)


This item is subject to a Creative Commons license.