Implementation of a numerical methodology for the stochastic characterization of the Valdivia 1960 9.5 Mw tsunami source.
Date
2022
Publisher
Universidad de Concepción.
Abstract
Probabilistic Tsunami Hazard Assessment (PTHA) provides a variety of mathematical and numerical
tools for evaluating the long-term exposure of coastal communities to tsunami-related hazards. Among
these, the logic tree method stands out for its usefulness and versatility in generating random slip
models and in dealing with epistemic and aleatory uncertainties, key elements for the stochastic study
of future tsunami scenarios. By combining the parameters that define a source model (such as
magnitude and rupture limits), this method allows the creation of a vast number of random source
models that can be used to assess future, long-term hazard. These models can also be used in
conjunction with data and observations from past tsunamis and earthquakes, opening new possibilities
for studying past tsunamis and their seismic source models.
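A minimal sketch of the branch-combination idea behind a logic tree follows, in Python; the magnitudes and rupture limits below are illustrative placeholders, not the discretization used in this work.

```python
# Minimal sketch of logic-tree branch enumeration for random source models.
# The branch values below are illustrative placeholders, not the actual
# discretization used in the thesis.
import itertools
import random

magnitudes   = [9.2, 9.3, 9.4, 9.5, 9.6]   # moment magnitude Mw
north_limits = [-36.0, -37.0, -38.0]       # rupture northern limit (deg latitude)
south_limits = [-45.0, -46.0, -47.0]       # rupture southern limit (deg latitude)

# Every combination of branch values defines one candidate source model.
branches = list(itertools.product(magnitudes, north_limits, south_limits))

# Draw a random subset of branches to seed stochastic slip realizations.
for mw, north, south in random.sample(branches, k=10):
    print(f"Mw={mw}, rupture from {north} to {south} deg latitude")
```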
This study proposes a numerical methodology, based on the aforementioned logic tree method, for
generating random tsunami source models with which to study historical tsunamis. The methodology
is tested here with data compiled for the great Valdivia 1960 9.5 Mw earthquake and tsunami. It
works by filtering the random source models produced with the logic tree method in a staggered
fashion. First, the models are filtered with empirical relations between magnitude and rupture
dimensions or rupture aspect ratios.
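This first filter can be sketched as below; the coefficients are placeholders for whichever published relation of the form log10(dim) = a + b·Mw is adopted, since the abstract does not name one, and the tolerance is illustrative.

```python
# Sketch of the first filter: empirical magnitude-rupture-dimension relations.
# Coefficients a_L, b_L, a_W, b_W are placeholders; relations of the form
# log10(dimension) = a + b * Mw are standard in the literature.
import math

def passes_scaling_filter(mw, length_km, width_km,
                          a_L=-2.5, b_L=0.6, a_W=-0.9, b_W=0.35,
                          tol=0.3):
    """Keep a model only if its rupture dimensions fall within a tolerance
    (in log10 units) of the empirical prediction for its magnitude."""
    log_L_pred = a_L + b_L * mw
    log_W_pred = a_W + b_W * mw
    ok_L = abs(math.log10(length_km) - log_L_pred) <= tol
    ok_W = abs(math.log10(width_km)  - log_W_pred) <= tol
    return ok_L and ok_W

# Example: screen one candidate model.
print(passes_scaling_filter(9.5, length_km=1000.0, width_km=150.0))
```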
The remaining models are then used to compute vertical seafloor deformation with the Okada
(1985) solution. These deformation fields are compared with geodetic data and observations
associated with the event of interest, in this case the Valdivia 1960 earthquake, eliminating all
models that do not satisfy these observations.
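A sketch of this geodetic filtering step is given below; it assumes a function okada_vertical that wraps some implementation of the Okada (1985) half-space solution (not defined here), and the RMS misfit threshold is illustrative.

```python
# Sketch of the geodetic filter. `okada_vertical` stands in for any
# implementation of the Okada (1985) half-space solution (e.g. a wrapper
# around an existing library); it is assumed, not defined here.
import numpy as np

def passes_geodetic_filter(model, obs_lon, obs_lat, obs_dz,
                           okada_vertical, rms_threshold=0.5):
    """Compare predicted vertical deformation against observed values (m)
    and keep the model if the RMS misfit is below the threshold.
    The 0.5 m threshold is illustrative, not the thesis's value."""
    pred_dz = okada_vertical(model, obs_lon, obs_lat)  # predicted uplift/subsidence
    rms = np.sqrt(np.mean((pred_dz - np.asarray(obs_dz)) ** 2))
    return rms <= rms_threshold
```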
The models that pass this filter are used as inputs for tsunami modeling in a staggered scheme.
Low-resolution topobathymetry grids are used first, to assess whether tsunami waves are registered
at locations known to have been inundated, eliminating the models that do not show this behavior.
For those that pass the low-resolution stage, high-resolution grids are used to model the tsunami,
estimate run-up and inundation, and compare them with reliable historical accounts and
sedimentological observations.
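The staggered scheme can be sketched as follows; simulate_tsunami is a hypothetical interface to whatever numerical solver is used (the abstract does not name one), assumed to return the maximum wave amplitude per gauge, and the thresholds are illustrative.

```python
# Sketch of the staggered tsunami-modelling filter. `simulate_tsunami` is a
# hypothetical solver interface assumed to return a dict of gauge -> maximum
# wave amplitude (m); thresholds are illustrative.

def staggered_tsunami_filter(models, gauges, coarse_grid, fine_grid,
                             simulate_tsunami, detect_threshold=0.1,
                             runup_tol=2.0, observed_runup=None):
    """First pass on a coarse grid: keep models that register a wave at every
    gauge known to have been inundated. Second pass on a fine grid: keep
    models whose run-up is within a tolerance (m) of historical estimates."""
    survivors = []
    for m in models:
        coarse = simulate_tsunami(m, coarse_grid, gauges)
        if all(amp >= detect_threshold for amp in coarse.values()):
            survivors.append(m)

    final = []
    for m in survivors:
        fine = simulate_tsunami(m, fine_grid, gauges)
        if observed_runup is None or all(
                abs(fine[g] - observed_runup[g]) <= runup_tol for g in gauges):
            final.append(m)
    return final
```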
The models that pass all of the filters above are then subjected to statistical analysis, such as
conditional probability analysis of slip amounts according to their location, or analysis of the
cumulative distribution functions of slip, in order to compare them with existing published models
of the Valdivia 1960 earthquake.
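These post-filter statistics can be illustrated with an empirical CDF and an exceedance fraction of slip at a single subfault, pooled across surviving models; the slip values in the example are purely illustrative.

```python
# Sketch of the post-filter statistics: empirical cumulative distribution of
# slip at a given subfault across surviving models, for comparison with
# published Valdivia 1960 source models.
import numpy as np

def slip_cdf(slips):
    """Return the empirical CDF (sorted values, cumulative probabilities)
    of slip amounts (m) pooled from the surviving models."""
    s = np.sort(np.asarray(slips))
    p = np.arange(1, s.size + 1) / s.size
    return s, p

def prob_slip_exceeds(slips, threshold):
    """Fraction of surviving models whose slip at this location exceeds
    `threshold` metres (a conditional-probability-style summary)."""
    return float(np.mean(np.asarray(slips) > threshold))

# Example with illustrative slip values at one subfault (metres):
slips = [18.0, 22.5, 30.1, 25.4, 27.8]
print(prob_slip_exceeds(slips, 25.0))  # -> 0.6
```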
To appraise how closely the random models that pass every filter converge to existing source models,
the Valdivia 1960 9.5 Mw event is used as a benchmark for this methodology, owing to the number of
published studies, the data available, reliable historical accounts, and source models computed with
different techniques and from different data sets, such as geodetic, seismic, or tsunami recordings.
It must be stressed that this methodology was designed, and is intended to be used, for the study of
historical tsunamis; it is tested here with a modern event, the Valdivia 1960 earthquake, precisely
because of the availability of data and studies.
The hypothesis proposed in this study is that the most likely tsunami source of the Valdivia 1960
event (slip distribution, rupture geometry, and rupture limits), estimated through the analysis of
random displacement models obtained with a logic tree structure, will be a solution that satisfies
the available geodetic and deformation data, tsunami observations, and historical accounts.
This work is subdivided into two parts. The first is a resolution test with synthetic deformation,
wave-arrival, and inundation data, designed to probe the capabilities and the response of the method
to different types of data availability: differences in data density (a large number of data points
versus sparse points), in spatial distribution (uniform coverage along the territory versus one or
several clusters of data), and in the relative availability of deformation versus tsunami data. Once
this test is performed, the methodology is applied to the case of the Valdivia earthquake.
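One way to lay out these synthetic data-availability scenarios is sketched below, assuming for simplicity that observation points are characterized by latitude alone; coordinates and counts are illustrative.

```python
# Sketch of synthetic resolution-test scenarios: dense vs. sparse observation
# points, and uniform vs. clustered coverage. Latitudes are illustrative
# values along the Chilean coast.
import numpy as np

rng = np.random.default_rng(0)

def uniform_stations(n, lat_min=-46.0, lat_max=-36.0):
    """Uniformly spaced observation points along the territory."""
    return np.linspace(lat_min, lat_max, n)

def clustered_stations(n, centers=(-39.8,), spread=0.3):
    """Observation points concentrated in one or more clusters."""
    per = n // len(centers)
    pts = [rng.normal(c, spread, per) for c in centers]
    return np.sort(np.concatenate(pts))

dense_uniform  = uniform_stations(100)   # many points, uniform coverage
sparse_uniform = uniform_stations(8)     # few points, uniform coverage
one_cluster    = clustered_stations(30)  # a single cluster of data
```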
Description
Thesis for the degree of Master of Science in Geophysics.
Keywords
Earthquakes; Tsunamis; Earthquake Risk Analysis; Mathematical Models; Subduction Zones; Chile; 1960