Changeset 6240


Timestamp:
Jan 29, 2009, 4:16:09 PM (15 years ago)
Author:
jakeman
Message:

updated patong validation paper. Introduction motivates need for new validation benchmark. Section 2 describes data sets needed. Remaining sections only contain a brief outline of what should be written

File:
1 edited

Legend:

Unmodified
Added
Removed
  • anuga_work/publications/boxing_day_validation_2008/patong_validation.tex

    r6084 r6240  
    2323%------Abstract--------------
    2424\begin{abstract}
    25 Geoscience Australia, in an open collaboration with the Mathematical Sciences Institute, The Australian National University, is developing a software application, ANUGA, to model the hydrodynamics of tsunamis, floods and storm surges. The open source software implements a finite volume central-upwind Godunov method to solve the non-linear depth-averaged shallow water wave equations. This paper investigates the veracity of ANUGA  when used to model tsunami inundation.  A particular aim was to make use of the comparatively large amount of observed data corresponding to the Indian ocean tsunmai event of December 2004, to provide a conditional assessment of the computational model's performance. Specifically a comparison is made between an inundation map, constructed from observed data, against modelled maximum inundation. This comparison shows that there is very good agreement between the simulated and observed values. The sensitivity of model results to the resolution of bathymetry data used in the model was also investigated. It was found that the performance of the model could be drastically improved by using finer bathymetric data which better captures local topographic features. The effects of two different source models was also explored.
     25
    2626\end{abstract}
    2727%======================Section 1=================
    28 Notes:  * Model source developed independently of inundation data.
    29         * Patong region was chosen because high resolution inundation map and bathymetry and topography data was available there
    3028
    3129\section{Introduction}
    32 Tsunamis are a potential hazard to coastal communities all over the world. These `waves' can cause loss of life and have huge social and economic impacts. The Indian Ocean tsunami killed around 230,000 people and caused billions of dollars in damage on the 26th of December 2004 (Synolakis {\it et al.} 2005). Hundreds of millions of dollars in aid has been donated to the rebuilding process and still the lives of hundreds of thousands of people will never be the same. Fortunately, catastrophic tsunamis of the scale of the 26 December 2004 event are exceedingly rare (Jankaew et al. 2008). However, smaller-scale tsunamis are more common and regularly threaten coastal communities around the world. Earthquakes that occur in the Java Trench near Indonesia (e.g. Tsuji {\it et al.} 1995) and along the Puysegur Ridge to the south of New Zealand (e.g. Lebrun {\it et al.} 1998) have potential to generate tsunamis that may threaten Australia's northwestern and southeastern coastlines.\nocite{synolakis05,tsuji95,lebrun98}
    33 
    34 For these reasons there has been an increased focus on tsunami hazard mitigation over the past three years. Tsunami hazard mitigation involves detection, forecasting, and emergency preparedness (Synolakis {\it et al.} 2005). Unfortunately, due to the small time scales (at the most a few hours) over which tsunamis take to impact coastal communities, real time models that can be used for guidance as an event unfolds are currently underdeveloped. Consequently current tsunami mitigation efforts must focus on developing a database of pre-simulated scenarios to help increase effectiveness of immediate relief efforts. Firstly areas of high vulnerability, such as densely populated regions at risk of extreme damage, are identified. Action can then be undertaken before the event to minimise damage (early warning systems, breakwalls etc.) and protocols put in place to be followed when the flood waters subside. In this spirit, Titov {\it et al.} (2001)\nocite{titov01} discuss a current Short-term Inundation Forecasting (SIFT) project for tsunamis.
    35 
    36 Several approaches are currently used to solve these problems. They differ in the way that the propagation of a tsunami is described. The shallow water wave equations, linearised shallow water wave equations, and Boussinesq-type equations are commonly accepted descriptions of flow. But the complex nature of these equations and the highly variable nature of the phenomena that they describe necessitate the use of numerical simulations.
    37 
    38 Geoscience Australia, in an open collaboration with the Mathematical Sciences Institute, The Australian National University, is in the final stages of completing a hydrodynamic modelling tool called ANUGA to simulate the shallow water propagation and run-up of tsunamis. Further development of this tool requires comprehensive assessment of the model. In particular the model must be validated and tested to ensure it is sufficiently robust and that the interactions and outcomes demonstrated are feasible and defensible, given the objectives. These objectives include: simulating flow over dry beds and the appearance of dry states within previously wet regions; accurately describing steady state flows and small perturbations from these steady states over rapidly-varying topography; and accurately resolve shocks. Applications of ANUGA include, but are not limited to, dam-breaks, storm surges, and tsunami propagation.
    39 
    40 The process of validating the ANUGA application is in its early stages, but initial indications are encouraging. As part of the Third International Workshop on Long-wave run-up Models in 2004\footnote{http://www.cee.cornell.edu/longwave}, four benchmark problems were specified to allow the comparison of numerical, analytical and physical models with laboratory and field data. One of these problems describes a wave tank simulation of the 1993 Okushiri Island tsunami off Hokkaido, Japan (Matsuyama {\it et al.} 2001)\nocite{matsuyama01}. The wave tank simulation of the Hokkaido tsunami was used as the first scenario for validating ANUGA. The dataset provided bathymetry and topography along with initial water depth and the wave specifications. The dataset also contained water depth time series from three wave gauges situated offshore from the simulated inundation area. Although good agreement was obtained between the observed and simulated water depth at each of the three gauges (Roberts {\it et al.} 2006) \nocite{roberts06} further validation is needed.
      30Tsunamis are a potential hazard to coastal communities all over the world. A number of recent large events have increased community and scientific awareness of the need for effective tsunami hazard mitigation. Tsunami modelling is a major component of hazard mitigation, which involves detection, forecasting, and emergency preparedness (Synolakis {\it et al.} 2005). Accurate models can be used to provide information that increases the effectiveness of actions undertaken before an event to minimise damage (early warning systems, breakwalls, etc.) and of protocols put in place to be followed when the flood waters subside.
     31
      32Several approaches are currently used to model tsunami propagation and inundation. These methods differ in both the formulation used to describe the evolution of the tsunami and the numerical methods used to solve the governing equations. The shallow water wave equations, linearised shallow water wave equations, and Boussinesq-type equations are commonly accepted descriptions of flow. The complex nature of these equations and the highly variable nature of the phenomena that they describe necessitate the use of numerical models. These models are typically used to predict quantities such as arrival times, wave speeds, wave heights and inundation extents, which are used to develop efficient hazard mitigation plans. Inaccuracies in model prediction can result in inappropriate evacuation plans and town zoning, which may lead to loss of life and large financial losses. Consequently, tsunami models must undergo sufficient testing to increase scientific and community confidence in the model predictions.
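For reference, the non-linear depth-averaged shallow water wave equations referred to throughout this paper can be written in conservative form (omitting friction and other source terms) as
\begin{eqnarray}
\frac{\partial h}{\partial t} + \frac{\partial (uh)}{\partial x} + \frac{\partial (vh)}{\partial y} & = & 0, \\
\frac{\partial (uh)}{\partial t} + \frac{\partial}{\partial x}\left(u^2 h + \frac{1}{2}gh^2\right) + \frac{\partial (uvh)}{\partial y} & = & -gh\frac{\partial z}{\partial x}, \\
\frac{\partial (vh)}{\partial t} + \frac{\partial (uvh)}{\partial x} + \frac{\partial}{\partial y}\left(v^2 h + \frac{1}{2}gh^2\right) & = & -gh\frac{\partial z}{\partial y},
\end{eqnarray}
where $h$ is the water depth, $u$ and $v$ are the depth-averaged velocities, $z$ is the bed elevation and $g$ is the acceleration due to gravity. The linearised and Boussinesq-type formulations mentioned above differ in the terms retained in, or added to, these equations.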
     33
      34Complete confidence in a model of a physical system can never be established; one can only show that the model does not fail under certain conditions. However, the utility of a model can be assessed through a process of validation and verification. Verification assesses the accuracy of the numerical method used to solve the governing equations, while validation investigates whether the model adequately represents the physical system. %Verification must be used to reduce numerical error before validation is used to assess model structure. In some situations it may be possible to increase the numerical accuracy of a model and produce a worse fit of the observed data.
     35
      36The sources of data used to validate and verify a model can be separated into three main categories: analytical solutions, scale experiments and field measurements. Analytical solutions of the governing equations of a model, if available, provide the best means of validating a numerical hydrodynamic model. The solutions provide spatially and temporally distributed values of important observables that can be compared against modelled results. However, analytical solutions to the governing equations are frequently limited to a small set of idealised examples that do not completely capture the more complex behaviour of 'real' events. Scale experiments, typically in the form of wave-tank experiments, provide a much more realistic source of data that better captures the complex dynamics of natural tsunamis, whilst allowing control of the event and much easier and more accurate measurement of the tsunami properties. However, comparison of numerical predictions with field data provides the most stringent test of model veracity. The use of field data increases the generality and significance of conclusions made regarding model utility. However, the use of field data also significantly increases the uncertainty of the validation experiment, which may constrain the ability to make unequivocal statements~\cite{lane94}.
     37
      38Currently the amount of tsunami related field data is limited. The cost of tsunami monitoring programs and bathymetry and topography surveys prohibits the collection of data in many of the regions in which tsunamis pose greatest threat. The resulting lack of data has limited the number of field data sets available to validate tsunami models, particularly those modelling tsunami inundation. Synolakis {\it et al.}~\cite{synolakis07} have developed a set of standards, criteria and procedures for evaluating numerical models of tsunami. They propose three analytical solutions to help identify the validity of a model and five scale comparisons (wave-tank benchmarks) and two field events to assess model veracity. The two field data benchmarks are very useful but only capture a small subset of possible tsunami behaviours, and only one of the benchmarks can be used to validate tsunami inundation. The type and size of a tsunami source, propagation extent, and local bathymetry and topography all affect the energy, waveform and subsequent inundation of a tsunami. Consequently, additional field data benchmarks that further capture the variability and sensitivity of the real world system would be useful to allow model developers to verify their models and subsequently use their models with greater confidence.
     39
      40In this paper we develop a field data benchmark to be used in conjunction with the other tests proposed by Synolakis {\it et al.} to validate and verify tsunami inundation. The benchmark is constructed from data collected around Patong Bay, Thailand during and immediately following the 2004 Indian Ocean tsunami. This area was chosen because the authors were able to obtain unusually high resolution bathymetry and topography data in this area and an extensive inundation map generated from a survey performed in the aftermath of the tsunami. A description of this data is given in Section~\ref{sec:data}.
     41
      42An associated aim of this paper is to illustrate the use of this new benchmark to validate an operational tsunami model. The specific intention is to test the ability of ANUGA to reproduce the inundation survey of maximum runup. ANUGA is a hydrodynamic modelling tool used to simulate tsunami propagation and run-up as well as rain-induced floods. The components of ANUGA are discussed in Section~\ref{sec:ANUGA}.
     43
     44%=================Section=====================
     45
      46\section{Indian Ocean tsunami of 26th December 2004}
      47Although appalling, the devastation caused by the 2004 Indian Ocean tsunami has heightened community, scientific and governmental interest in tsunami and in doing so has provided a unique opportunity for further validation of tsunami models. Enormous resources have been spent to obtain many measurements of phenomena pertaining to this event to better understand the destruction that occurred. Data sets from seismometers, tide gauges, GPS stations, a few satellite overpasses, subsequent coastal field surveys of run-up and flooding and measurements from ship-based expeditions, have now been made available (Vigny {\it et al.} 2005, Amnon {\it et al.} 2005, Kawata {\it et al.} 2005, and Liu {\it et al.} 2005)\nocite{vigny05,amnon05,kawata05,liu05}. A number of studies have utilised these data to calibrate models of the tsunami source~\cite{grilli07}, and to match tide gauge recordings\cite{}, maximum wave heights~\cite{asavanant08} and runup locations~\cite{ioualalen07}. We propose to use this event as an additional field-data benchmark for verification of tsunami models. This event captures certain tsunami behaviours that are not present in the benchmarks proposed by Synolakis {\it et al.}~\cite{synolakis07}.
     48
      49Synolakis {\it et al.} detail two field data benchmarks. The first test compares model results against observed data from the Hokkaido-Nansei-Oki tsunami that occurred around Okushiri Island, Japan on the 12th of July 1993. This tsunami provides an example of extreme runup generated from reflections and constructive interference resulting from local topography and bathymetry. The benchmark consists of two tide gauge records and numerous spatially distributed point sites at which maximum runup elevations were observed. The second benchmark is based upon the Rat Islands tsunami that occurred off the coast of Alaska on the 17th of November 2003. The Rat Islands tsunami provides a good test for real-time forecasting models since the tsunami was recorded at three tsunameters. The test requires matching the propagation model data with the DART recording to constrain the tsunami source model and using the propagation model to reproduce the tide gauge record at Hilo.
     50
     51%The tsunamis used by the two standard benchmarks and the 2004 tsunami are quite different. They all arise from cosiesmic displacement resulting from an earthquake, however they all occur in very different geographical regions. The Hokkaido-Nansei-Oki tsunami was generated by an earthquake with a magnitude of 7.8 and only travelled a small distance before inundating Okishiri Island. The event provides an example of extreme runup genereated from reflections and constructive interference resulting from local topography and bathymetry. In comparison the Rat islands tsunami was generated by an earthquake of the same magnitude but had to travel a much greater distance. The event provides a number of tide gauge recordings that capture the change in wave form as the tsunami evolved.
     52
      53The December 2004 tsunami was a much larger event than the two previously described. It was generated by a disturbance, resulting from a M$_w$=9.2-9.3 mega-thrust earthquake, that propagated 1200-1300 km. Consequently the energy of the resulting wave was much larger than the waves generated from the more localised and smaller magnitude aforementioned events. WAS THE WAVELENGTH, VELOCITY (and thus average ocean depth) DIFFERENT FROM THESE TWO EVENTS??? If so state something like: this larger wavelength and energy, and simply the different geology of the area, produced a different wave signal and a different pattern of inundation. Here we focus on the large inundation experienced at Patong Bay on the west coast of Thailand.
     54
     55\section{Data}
      56Hydrodynamic simulations require very little data in comparison to models of many other environmental systems. Tsunami models typically only require bathymetry and topography data to approximate the local geography, a parameterisation of the tsunami source from which appropriate initial conditions can be generated, and a locally distributed quantity such as Manning's friction coefficient to approximate friction. Here we discuss the bathymetric and topographic data sets and the source condition that are necessary to implement the proposed benchmark. Friction is discussed in Section~\ref{sec:}
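For completeness, the friction referred to here enters the momentum equations through a Manning friction slope of the form
\begin{equation}
S_{fx} = \frac{n^2 u \sqrt{u^2+v^2}}{h^{4/3}}, \qquad
S_{fy} = \frac{n^2 v \sqrt{u^2+v^2}}{h^{4/3}},
\end{equation}
which contributes source terms $-ghS_{fx}$ and $-ghS_{fy}$ to the $x$ and $y$ momentum equations respectively, where $n$ is Manning's friction coefficient, $h$ is the water depth and $(u,v)$ is the depth-averaged velocity.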
     57
      58Patong Bay and the surrounding region are the source of an unusually large amount of data pertaining to the 2004 tsunami, which is necessary for tsunami verification. The authors obtained a number of raw data sets which were analysed and checked for quality (QC'd) and subsequently gridded for easier visualisation and input into tsunami models.
     59
     60\subsection{Bathymetric and topographic data}
      61The two arc minute grid data set, DBDB2, was obtained from the US Naval Research Laboratory and used to approximate the bathymetry in the Bay of Bengal. This grid was further interpolated to a 27 arc second grid. In the Andaman Sea we replaced the DBDB2 data with a 3 arc second grid obtained from NOAA. Finally a 1 arc second grid was used to approximate the bathymetry in Patong Bay and the immediately adjacent regions. This elevation data was created from the digitised Thai Navy bathymetry chart, no 358. A visualisation of the topography data set used in Patong Bay is shown in Figure~\ref{fig:patong_bathymetry}. The continuous topography is an interpolation of the 1 arc second grid created for this area from the known elevations measured at the coloured dots.
     62
      63The sub-sampling of larger grids was performed using {\bf resample}, a GMT program. The gridding of data was performed using {\bf Intrepid}, a commercial geophysical processing package developed by Intrepid Geophysics. The gridding scheme employed the nearest neighbour algorithm followed by an application of minimum curvature Akima spline smoothing.
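The essence of this gridding step (nearest-neighbour interpolation of the quality-controlled soundings onto a regular grid, followed by a smoothing pass) can be sketched in a few lines of Python. The file name, grid spacing and smoothing length below are illustrative assumptions only, and a simple Gaussian filter stands in for the minimum curvature Akima spline smoothing applied in Intrepid.
\begin{verbatim}
# Sketch only: grid scattered (x, y, z) soundings onto a regular grid.
# 'patong_soundings.csv' and the 1 arc-second spacing are assumptions.
import numpy as np
from scipy.interpolate import griddata
from scipy.ndimage import gaussian_filter

x, y, z = np.loadtxt('patong_soundings.csv', delimiter=',', unpack=True)

# Regular grid at roughly 1 arc-second spacing over the data extent.
dx = 1.0 / 3600.0
xi = np.arange(x.min(), x.max(), dx)
yi = np.arange(y.min(), y.max(), dx)
X, Y = np.meshgrid(xi, yi)

# Nearest-neighbour gridding of the raw soundings ...
Z = griddata((x, y), z, (X, Y), method='nearest')

# ... followed by a smoothing pass (Gaussian filter as a stand-in for
# the minimum curvature Akima spline smoothing used in Intrepid).
Z_smooth = gaussian_filter(Z, sigma=2.0)
\end{verbatim}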
     64
    4165
    4266\begin{figure}[ht]
    4367\begin{center}
    44 \includegraphics[width=8.0cm,keepaspectratio=true]{monai-gauge-05-new.png}
    45 \caption{Comparison of ANUGA simulation against the wave tank simulation of the 1993 Okushiri Island tsunami off Hokkaido, Japan}
    46 \label{fig:most_3_ruptures}
     68\includegraphics[width=8.0cm,keepaspectratio=true]{patong_bay_data.jpg}
     69\caption{Is there a new picture with river included???}
     70\label{fig:patong_bathymetry}
    4771\end{center}
    4872\end{figure}
    4973
    50 Although appalling, the devastation caused by the 2004 Indian Ocean tsunami has heightened community, scientific and governmental interest in tsunami and in doing so has provided a unique opportunity for further validation of tsunami models. Enormous resources have been spent to obtain many measurements of phenomenon pertaining to this event to better understand the destruction that occurred. Data sets from seismometers, tide gauges, GPS stations, a few satellite overpasses, subsequent coastal field surveys of run-up and flooding and measurements from ship-based expeditions, have now been made available (Vigny {\it et al.} 2005, Amnon {\it et al.} 2005, Kawata {\it et al.} 2005, and Liu {\it et al.} 2005)\nocite{vigny05,amnon05,kawata05,liu05}.
    51 
    52 An aim of this paper is to use this relative abundance of observed data corresponding to this event to further validate the use of ANUGA for modelling the inundation of tsunami.  The specific intention is to test the ability of the model to reproduce an inundation survey of maximum runup constructed in the aftermath of the 2004 tsunami. A further aim is to test the sensitvity of the model predictions to bathymetry and tsunami source used.
    53 %=================Section=====================
    54 
     55 \section{Modelling the Tsunami of 26th December 2004}
    56 The evolution of earthquake-generated tsunamis has three distinctive stages: generation, propagation and run-up (Titov and Gonzalez, 1997) \nocite{titov97a}. To accurately model the evolution of a tsunami all three stages must be dealt with. Here we investigate the use of two different source models, URS and the Method of Splitting Tsunamis Model (MOST) and  to model the generation of a tsunami and open ocean propagation. The resulting data is then used to provide boundary conditions for the inundation package ANUGA (see below) which is used to simulate the propagation of the tsunami in shallow water and the tsunami run-up.
    57 
    58 \begin{figure}
    59 \begin{center}
    60 \includegraphics[width=3.0in,keepaspectratio=true]{3stages.jpg}
    61 \end{center}
    62 \caption{Triangular elements in the 2D finite volume method.}
    63 \label{fig:2dmesh}
    64 \end{figure}
    65 
    66 
    67 Here we note that the MOST model was developed as part of the Early Detection and Forecast of Tsunami (EDFT) project (Titov {\it et al.} 2005)\nocite{titov05}. MOST is a suite of integrated numerical codes capable of simulating tsunami generation, its propagation across, and its subsequent run-up. The exact nature of the MOST model is explained in (Titov and Synolakis 1995, Titov and Gonzalez 1997, Titov and Synolakis 1997, and Titov {\it et al.} 2005)\nocite{titov95,titov97a,titov97b,titov05}.
    68 
    69 ANUGA is an inundation tool that solves the depth integrated shallow water wave equations. The scheme used by ANUGA, first presented by Zoppou and Roberts (1999)\nocite{zoppou99}, is a high-resolution Godunov-type method that uses the rotational invariance property of the shallow water equations to transform the two-dimensional problem into local one-dimensional problems. These local Riemann problems are then solved using the semi-discrete central-upwind scheme of Kurganov {\it et al.} (2001) \nocite{kurganov01} for solving one-dimensional conservation equations. The numerical scheme is presented in detail in (Zoppou and Roberts 1999, Zoppou and Roberts 2000, and Roberts and Zoppou 2000, Nielsen {\it et al.} 2005) \nocite{zoppou99,zoppou00,roberts00,nielsen05}. An important capability of the software is that it can model the process of wetting and drying as water enters and leaves an area. This means that it is suitable for simulating water flow onto a beach or dry land and around structures such as buildings. It is also capable of adequately resolving hydraulic jumps due to the ability of the finite-volume method to handle discontinuities.
    70 
    71 
    72 \subsection{Tsunami Generation}
    73 The Indian Ocean tsunami of 2004 was generated by severe coseismic displacement of the sea floor as a result of one of the largest earthquakes on record. The M$_w$=9.2-9.3 mega-thrust earthquake occurred on the 26 December 2004 at 0h58'53'' UTC approximately 70 km offshore North Sumatra. The disturbance propagated 1200-1300 km along the Sumatra-Andaman trench time at a rate of 2.5-3 km.s$^{-1}$ and lasted approximately 8-10 minutes (Amnon {\it et al.} 2005)\nocite{amnon05}. At present ANUGA does not possess an explicit easy to use method for generating tsunamis from coseismic displacement, although such functionality could easily be added in the future. Implementing an explicit method for simulating coseismic displacement in ANUGA requires time for development and testing that could not be justified given the aims of the project and the time set aside for completion. Consequently in the following we employ the URS model and the MOST model to determine the sea floor deformation.
    74 
    75 The URS code uses a source model based on Wang (Wang et al. 2003) which is an elastic crustal model. The source parameters used to simulate the 2004 Indian Ocean Tsunami
    76 were taken from Chlieh (2007). The resulting sea floor displacement ranges from about - 5.0 to 5.0 metres and is shown in figure 3.
    77 
    78 
    79 The solution of Gusiakov (1972) \nocite{gusiakov72} is used by the MOST model to calculate the initial condition. This solution describes an earthquake consisting of two orthogonal shears with opposite sign. Specifically we adopt the parameterisation of Greensdale (2007) \nocite{greensdale07} who modelled the corresponding displacement by dividing the rupture zone into three fault segments with different morphologies and earthquake parameters. Details of the parameters associated with each of three regions used here are given in the same paper. The resulting sea floor displacement is shown in Figure \ref{fig:most_3_ruptures} and ranges between 3.6 m and 6.2 m.
     74\subsection{Tsunami source}
      75The Indian Ocean tsunami of 2004 was generated by severe coseismic displacement of the sea floor as a result of one of the largest earthquakes on record. The M$_w$=9.2-9.3 mega-thrust earthquake occurred on 26 December 2004 at 0h58'53'' UTC approximately 70 km offshore North Sumatra. The disturbance propagated 1200-1300 km along the Sumatra-Andaman trench at a rate of 2.5-3 km.s$^{-1}$ and lasted approximately 8-10 minutes (Amnon {\it et al.} 2005)\nocite{amnon05}.
     76
      77Many parameterisations of the 2004 tsunami source are available. Some are determined from various geological surveys of the site; others solve an inverse problem which calibrates the source based upon the tsunami wave signal and/or runup. Although possibly producing a closer match between observed and simulated data, the latter is inappropriate for use in this benchmark: the data used to calibrate the model needs to be independent of the validation data. The source parameters used to simulate the 2004 Indian Ocean Tsunami were taken from Chlieh (2007). HOW IS SOURCE PARAMETERISED. FROM GEOGRAPHICAL STUDY OR INVERSE PROBLEM TRYING TO MATCH WAVE SIGNAL. DOES ANYONE HAVE A COPY THEY COULD SEND ME PLEASE? The resulting sea floor displacement ranges from about -5.0 to 5.0 metres and is shown in Figure~\ref{fig:chlieh_slip_model}.
    8078
    8179\begin{figure}[ht]
     
    8381\includegraphics[width=8.0cm,keepaspectratio=true]{chlieh_slip_model.png}
     8482\caption{Location and magnitude of the sea floor displacement associated with the 26 December 2004 tsunami. Source parameters taken from Chlieh {\it et al.} (2007)}
    85 \label{fig:most_3_ruptures}
     83\label{fig:chlieh_slip_model}
    8684\end{center}
    8785\end{figure}
    8886
    89 
    90 \subsection{Tsunami Propagation}
    91 We use both the URS model and the MOST model to simulate the propagation of the 2004 Indian Ocean tsunami in the deep ocean ocean, based on a discrete representation of the initial deformation of the sea floor, described above.
     87\subsection{Inundation survey data}
      88The bathymetry data and source parameterisation can be inserted into the tsunami model and the model run. From the simulation, runup and ocean surface elevation can be obtained. We propose that a `correct' tsunami model should reproduce the inundation map shown in Figure~\ref{fig:patongescapemap}. Furthermore, the model should simulate a leading depression followed by 3??? crests. Are there any eye witness accounts of how many waves arrived at Patong???
     89
     90\begin{figure}[ht]
     91\begin{center}
     92\includegraphics[width=8.0cm,keepaspectratio=true]{patongescapemap.jpg}
      93\caption{Map of maximum inundation at Patong Bay.}
     94\label{fig:patongescapemap}
     95\end{center}
     96\end{figure}
     97
     98
     99\section{Verification Procedure}
     100
     101%=================Section=====================
     102
     103\subsection{ANUGA}
      104ANUGA is an inundation tool that solves the depth integrated shallow water wave equations. The scheme used by ANUGA, first presented by Zoppou and Roberts (1999)\nocite{zoppou99}, is a high-resolution Godunov-type method that uses the rotational invariance property of the shallow water equations to transform the two-dimensional problem into local one-dimensional problems. These local Riemann problems are then solved using the semi-discrete central-upwind scheme of Kurganov {\it et al.} (2001) \nocite{kurganov01} for solving one-dimensional conservation equations. The numerical scheme is presented in detail in (Zoppou and Roberts 1999, Zoppou and Roberts 2000, Roberts and Zoppou 2000, and Nielsen {\it et al.} 2005) \nocite{zoppou99,zoppou00,roberts00,nielsen05}. An important capability of the software is that it can model the process of wetting and drying as water enters and leaves an area. This means that it is suitable for simulating water flow onto a beach or dry land and around structures such as buildings. It is also capable of adequately resolving hydraulic jumps due to the ability of the finite-volume method to handle discontinuities.
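For the interested reader, one common statement of the semi-discrete central-upwind flux of Kurganov {\it et al.} (2001), applied at each edge of the local one-dimensional problems, is
\begin{equation}
H_{j+1/2} = \frac{a^+_{j+1/2} F(U^-_{j+1/2}) - a^-_{j+1/2} F(U^+_{j+1/2})}{a^+_{j+1/2} - a^-_{j+1/2}}
 + \frac{a^+_{j+1/2}\, a^-_{j+1/2}}{a^+_{j+1/2} - a^-_{j+1/2}} \left( U^+_{j+1/2} - U^-_{j+1/2} \right),
\end{equation}
where $U^{\pm}$ are the reconstructed conserved quantities on either side of the edge, $F$ is the flux function of the one-dimensional shallow water equations and $a^{\pm}$ are the maximal and minimal local wave speeds.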
     105
     106
      107\subsection{Tsunami Source and Propagation}
      108We use the URS model to simulate the propagation of the 2004 Indian Ocean tsunami in the deep ocean, based on a discrete representation of the initial deformation of the sea floor, described above.
    92109
     93110The URS code models the propagation of the tsunami in deep water using the finite difference method to solve the non-linear shallow water equations in
     94111spherical co-ordinates with friction and Coriolis terms. The code is based on Satake (1995) with significant modifications made by the URS corporation
     95112(Thio et al. 2007) and Geoscience Australia (Burbidge et al. 2007). The tsunami is propagated via a staggered grid system starting with coarser grids
    96 and ending with the finest one. The URS code is also capable of calculating inundation.
    97 
    98 Most models the propogation of the tsunami using a numerical dispersion scheme that solves the non-linear shallow-water wave equations in spherical coordinates, with Coriolis terms. This model has been extensively tested against a number of laboratory experiments and was successfully used for simulations of many historical tsunamis (Titov and Synolakis 1997, Titov and Gonzalez 1997, Bourgeois {\it et al.} 1999, and Yeh {\it et al.} 1994)\nocite{titov97a,titov97b,bourgeois99,yeh94}.
    99 
    100 The computational domain for the MOST simulation, was defined to extend from $...$E to $...$E and from $...$S to $...$S. The bathymetry in this region was estimated using ...
     113and ending with the finest one. The URS code is also capable of calculating inundation. CAN WE PRODUCE AN INUNDATINO MAP OVER THE SAME AREA TO COMPARE WITH ANUGA???
     114
      115The computational domain for the URS simulation was defined to extend from $...$E to $...$E and from $...$S to $...$S. The bathymetry in this region was estimated using ...
    101116%a 4 arc minute data set developed by the CSIRO specifically for the ocean forecasting system used here. It is based on dbdb2 (NRL), and GEBCO data sets. The tsunami propagation incorporated here was modelled by the Bureau of Meteorology, Australia for six hours using a time step of 5 seconds (4320 time steps in total).
    102117
    103 The output of the URS and MOST models were produced for the sole purpose of providing an approximation of the tsunami's size and momentum that can be used to estimate the tsunami run-up. ANUGA could also have been used to model the propagation of the tsunami in the open ocean. The capabilities of the numerical scheme over such a large extent, however, have not been adequately tested. This issue will be addressed in future work.
     118The output of the URS model was produced for the sole purpose of providing an approximation of the tsunami's size and momentum that can be used to estimate the tsunami run-up. ANUGA could also have been used to model the propagation of the tsunami in the open ocean. The capabilities of the numerical scheme over such a large extent, however, have not been adequately tested. This issue will be addressed in future work.
    104119
    105120\subsection{Tsunami Inundation}
     106121The utility of the URSGA model decreases with water depth unless an intricate sequence of nested grids is employed. On the other hand, while the ANUGA model is less suitable for earthquake source modelling and large study areas, it is designed with detailed on-shore inundation in mind.
    107 Consequently, the Geoscience Australia tsunami modelling methodology is based on a hybrid approach using models like URSGA (or the MOST model) for tsunami generation and propagation up to a 100m depth contour where the wave is picked up by ANUGA and propagated on shore using the finite-volume method on unstructured triangular meshes.
     122Consequently, the Geoscience Australia tsunami modelling methodology is based on a hybrid approach using models like URSGA for tsunami generation and propagation up to a 100m depth contour where the wave is picked up by ANUGA and propagated on shore using the finite-volume method on unstructured triangular meshes.
    108123
    109124In this case the open ocean boundary of the ANUGA study area was chosen to roughly follow the 100m depth contour along the west coast of Phuket Island.
     
    112127\begin{center}
    113128\includegraphics[width=8.0cm,keepaspectratio=true]{new_domain.png}
    114 \caption{Computational Domain. Can we easily create picture like this one for our new scenario}
     129\caption{Computational Domain. CAN WE CREATE A PICTURE LIKE THIS FOR OUR NEW SCENARIO}
    115130\label{fig:computational_domain}
    116131\end{center}
     
     119134The domain was discretised into approximately ...,000 triangles. The resolution of the grid was increased in certain regions to efficiently increase the accuracy of the simulation. The grid resolution ranged from a maximum triangle area of $...\times 10^5$ m$^2$ near the western ocean boundary to $...$ m$^2$ in the small regions surrounding the run-up points and tide gauges. The triangle size around islands and obstacles which ``significantly affect'' the tsunami was also reduced. The authors used their discretion to determine what obstacles significantly affect the wave through an iterative process.
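A minimal sketch of how such a variable-resolution mesh can be specified is given below. The polygon coordinates, triangle areas and boundary tag names are illustrative placeholders only, and the function name and keyword arguments follow the ANUGA user manual rather than the actual scripts used for this study.
\begin{verbatim}
# Sketch only: build a mesh that is coarse offshore and refined
# around Patong Bay.  All coordinates, areas and tags are dummies.
from anuga.pmesh.mesh_interface import create_mesh_from_regions

bounding_polygon = [(0., 0.), (100000., 0.),
                    (100000., 100000.), (0., 100000.)]
patong_polygon   = [(60000., 40000.), (70000., 40000.),
                    (70000., 50000.), (60000., 50000.)]

create_mesh_from_regions(bounding_polygon,
                         boundary_tags={'ocean': [0], 'land': [1, 2, 3]},
                         maximum_triangle_area=1.0e5,   # coarse offshore
                         interior_regions=[(patong_polygon, 50.0)],
                         filename='patong.msh')
\end{verbatim}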
    120135
    121 The bathymetry and topography of the region was estimated using...% a data set produced by NOAA. Specifically the bathymetry was specified on a 2 arc minute grid (ETOPO2) and the topography on a 3 arc second grid. A penalised least squares technique was then used to interpolate the elevation onto the computational grid.
    122 
    123 \subsubsection{Boundary Conditions}
    124 The boundary of the computational domain comprises N=... linear segments. Those segments which lie entirely on land were set as reflective boundaries. The segments that lie in depths greater than 50m were set as Dirichlet boundary conditions with the stage (water elevation) equal to zero. Finally all other segments were time varying boundaries. The value at these boundaries was interpolated from the estimates of the wave depth and momentum obtained from the URS and MOST simulation.
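A minimal sketch of how these three boundary types can be set up in ANUGA is shown below. The import path, class names and call signatures follow the ANUGA user manual; the boundary tag names and the boundary data file name are placeholders rather than those of the actual simulation scripts.
\begin{verbatim}
# Sketch only: reflective, Dirichlet and time-varying boundaries.
# Tag names ('land', 'deep', 'ocean') and file names are placeholders.
from anuga.shallow_water import (Domain, Reflective_boundary,
                                 Dirichlet_boundary, Field_boundary)

domain = Domain('patong.msh')                # mesh generated earlier

Br = Reflective_boundary(domain)             # segments entirely on land
Bd = Dirichlet_boundary([0.0, 0.0, 0.0])     # stage = 0 where depth > 50 m
Bf = Field_boundary('urs_boundary.sww', domain)  # interpolated URS output

domain.set_boundary({'land': Br, 'deep': Bd, 'ocean': Bf})
\end{verbatim}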
    125 
    126 \subsection{Bathymetric and Topographic Data}
     136
     137%================Section======================
     138\section{Results}
    127139\begin{figure}[ht]
    128140\begin{center}
    129 \includegraphics[width=8.0cm,keepaspectratio=true]{patong_bay_data.jpg}
    130 \caption{Is there a new picture with river included???}
    131 \label{fig:computational_domain}
     141\includegraphics[width=8.0cm,keepaspectratio=true]{Patong_0_8lowres.jpg}
     142\caption{Simulated inundation versus observed inundation}
     143\label{fig:inundationcomparison}
    132144\end{center}
    133145\end{figure}
    134146
    135 
    136 Both the source models MOST and ... require the input of bathymetric data desribing the geometry of the sea floor. The data used ...
    137 
    138 
    139 
    140 
    141 %================Section======================
    142 \section{Results}
    143 
    144 
    145 Table \ref{tab:run-up_locations}  also highlights the misrepresentation of the local coastline. Large discrepancies, in the order of metres, exist between the modelled and observed elevation. Furthermore, three run-up observation sites were deemed to be initially underwater. This suggests that results could be improved further by employing finer bathymetric data when it becomes available. Yet, despite the poor bathymetric data there is still a moderate correlation between the observed and modelled run-up values suggesting that local variations in the energy of the tsunami are being approximated reasonably well.
    146 
    147147%================Section=====================
    148148
     
     150150We have simulated the tsunami inundation of a small irregular region of the west Thailand coast surrounding Phuket using the inundation tool ANUGA. The tsunami size and position at the boundaries of this region were estimated using the MOST model which was used to simulate the generation and propagation of the tsunami in the deep ocean. Specifically the parameterisation of Greensdale {\it et al.} (2007) \nocite{greensdale07} was used to describe the tsunami source and the subsequent wave elevation and momentum required by the inundation simulation were interpolated from the MOST simulation at each time step.
    151151
    152 Comparison between observed and modelled run-up at 18 sites show reasonable agreement. We also find a modest agreement between the observed and modelled tsunami signal at the two tide gauge sites. The arrival times of the tsunami is approximated well at both sites. The amplitude of the first trough and peak is approximated well at the first tide gauge (Taphao-Noi), however the amplitude of the first wave was underestimated at the second gauge (Mercator yacht).  The amplitude of subsequent peaks and troughs, at both gauges, are underestimated and a phase lag between the observed and modelled arrival times of wave peaks is evident after the first peak. Grilli {\it et al.} (2006) \nocite{grilli06} also could not reproduce the correct arrival time at the Taphao-Noi tide gauge or reproduce the signal at the Mercator yacht.
    153 
    154 The performance of the model could be improved by using finer bathymetric data, which at present cannot be obtained by the authors, and by a more accurate estimation of the initial tsunami source. The wave height observed at a particular point along the coast is strongly influenced by relatively small scale bathymetric and coastal features which may be under-resolved by the current computational mesh or poorly represented by the sparse bathymetry and topography data set. These problems may also cause errors in simulated arrival times in coastal areas adjacent to regions consisting of inaccurate bathymetry data. Titov and Gonzalez (1997) \nocite{titov97a} state that for most cases 10-50m horizontal resolution of bathymetry data is essential. As mentioned above we could only obtain 2 arc minute (~3.6km) bathymetry which is most likely insufficient. Topography is approximated using a 3 arc second (~90m) grid which is much more appropriate. However, when combined, these data sets do not reproduce the position of coastline well. If a finer resolved bathymetric data set could be obtained for the shallow waters of the Thai coast (say in regions with important bathymetric features) a much better result could be expected.
    155 
    156 The approximation of the tsunami source also affects the near shore amplitude of the tsunami wave. As the graphs and tables above show, the amplitude of the tsunami is at times misrepresented and this is partly due to an suboptimal reproduction of the initial coseismic displacement. Grilli {\it et al.} (2006) \nocite{grilli06} obtain improved reproduction of tsunami amplitude when they optimise the parameters of the tsunami source based on the model's ability to reproduce certain observed behaviour. We would like to think, and will explore, that it is this optimisation that yields more accurate results rather than any deficiency of the ANUGA model.
    157152
    158153%================Acknowledgement===================
     
    268263Digitised Thai Navy bathymetry chart no 358.
    269264
    270 The sub-sampling of larger grids was performed by using \bold resample \endbold  a GMT program.
    271 The gridding of data was performed using \bold Intrepid \endbold a commercial geophysical processing package developed by Intrepid Geophysics.
      265The sub-sampling of larger grids was performed using {\bf resample}, a GMT program.
      266The gridding of data was performed using {\bf Intrepid}, a commercial geophysical processing package developed by Intrepid Geophysics.
     272267The gridding scheme was nearest neighbour followed by minimum curvature Akima spline smoothing.
    273268
     
    286281\end{document}
    287282
    288 We can never prove that a model of a physical system is correct only that it does not fail under certain conditions.
    289 A model must be verified and validation. The former is the process of indentifying whether the numerical solver used produces an accurate solution of the governing equaations. The later is used to assess whether the model adequately represents the physcial system. This is achived by comparing the model resutls with physical measurements/observed data and theory. Sometimes coincidence will mean that a less numerically accurate solution can match the measured data more closely than a more numerically accurate one. So it is necessary to first reduce numerical error through the verification processs and then assess modelling errors. lane
    290 
    291 Is the difference between modelled resutls and observations a result of of poor model process and representation and numerics or poor model paramteterisation horrit
     283
     284Main source of uncertainty arises from inaccuracies in initial condition (source), inaccurate bathymetry data, to a lesser extent friction
     285
      286A single experiment can refute a model but cannot validate it. Need as many tests as possible to be confident in prediction. Question arises: how many should we do? With finite experiments more weight should be given to a particular experiment if the range of the input function and the material properties are both broad so that the universal character of the model is tested.
     287
     288Expressions:
     289sufficient verification/falsification of model
     290Confidently utilise a model
     291
      292Predictive validation is only one aspect of model evaluation. Need to assess model explanation.
     293
     294Conservation of mass
     295convergence
     296
     297spatial and temporal discretisation errors, round off errors due to limited numerical precision
     298
     299analytical benchmarking:
     300ensuring equations are solved accurately
     301single wave on a beach
     302Solitary wave on composite beach
     303subaerial landslide on simple beach
     304
      305Analytical solutions only represent idealised and simplified events that do not fully capture the complexity of 'real' flows. Provide temporally and spatially distributed data that field data can rarely match.
     306
      307scale comparisons (laboratory benchmarking):
      308Scale differences are not believed to be important. Scale experiments generally do not have the same bottom friction characteristics as the real scenario but this has not proven to be a problem. The long wavelength of tsunamis tends to mean that friction is less important in comparison to the motion of the wave
      309Single wave on a simple beach
     310Solitary wave on composite beach
     311Conical island
     312Monai Valley
     313Landslide
     314
     315includes comparisons with validation data sets generated by other models of higher dimensionality and resolution.
     316
     317Often flow geometries are simplified
     318
     319
     320Field benchmarking:
     321Most important verification process
     322Hydrodynamic inversion to predict the source is an ill posed problem
      32312 July 1993 Hokkaido-Nansei-Oki tsunami around Okushiri Island Japan: extreme runup height of 31.7m was found at the tip of a narrow gulley with the small cove at Monai
     32417 November 2003 Rat Islands Tsunami
     325
      326Construction of more than one model can reveal biases in a single model. Two types of comparisons: 1) between models that are conceptually similar and 2) between those that are different. In the former case we are interested in how the choice of numerical solver and discretisation affects results; the latter can help determine the level of physical process representation necessary to represent an observed data set.
     327
      328Moving to field data increases the generality and significance of the scientific evidence obtained. However we also significantly increase the uncertainty of the validation experiment, which may constrain the ability to make unequivocal statements. E.g. in bathymetry, source condition, friction.
     329
      330Calibration of the model is often used to compensate for uncertainty in the model inputs. Calibration results in a further loss of experimental control as a unique solution may not exist.
     331
      332Verification needs to assess point data, spatially distributed data and bulk (lumped) data.
     333
      334Synolakis {\it et al.}~\cite{synolakis07} detail two field events that have been previously used to validate tsunami models: the Hokkaido-Nansei-Oki tsunami that occurred around Okushiri Island, Japan on the 12th of July 1993 and the Rat Islands tsunami that occurred off the coast of Alaska on the 17th of November 2003.
     335
     336
      337Inundation map only useful if mesh and topography resolution fine enough. Hard to measure what the model predicts. How deep does inundation need to be for it to be visible during a field study?
     338
     339Notes:
      340Okushiri provides an example of extreme runup generated from reflections and constructive interference resulting from local topography and bathymetry. Numerous point sites at which runup elevations were observed are available.  The highest runup of 31.7 m in a valley north of Monai needs to be approximated with the numerical model. In addition, two tide gauge records at Iwanai and Esashi need to be estimated.
     341
     342
     343
      344The Rat Islands tsunami provides a good test for real-time forecasting models since the tsunami was recorded at three tsunameters. The test requires matching the propagation model data with the DART recording to constrain the tsunami source model. The inundation model is to reproduce the tide gauge record at Hilo.
     345
      346Patong Bay benchmark provides spatially distributed field data for comparison.
     347
      348A single experiment can refute a model but cannot validate it. Need as many tests as possible to be confident in prediction. Question arises: how many should we do?
     349
     350DO I SAY WE HAVE MUX @ FILES DESCRIBING SHAPE OF WAVE YES. MAKES CONSISTENT
     351
     352Notes:  * Model source developed independently of inundation data.
     353        * Patong region was chosen because high resolution inundation map and bathymetry and topography data was available there
     354
      355Geoscience Australia, in an open collaboration with the Mathematical Sciences Institute, The Australian National University, is developing a software application, ANUGA, to model the hydrodynamics of tsunamis, floods and storm surges. The open source software implements a finite volume central-upwind Godunov method to solve the non-linear depth-averaged shallow water wave equations. This paper investigates the veracity of ANUGA when used to model tsunami inundation. A particular aim was to make use of the comparatively large amount of observed data corresponding to the Indian Ocean tsunami event of December 2004, to provide a conditional assessment of the computational model's performance. Specifically a comparison is made between an inundation map constructed from observed data and the modelled maximum inundation. This comparison shows that there is very good agreement between the simulated and observed values. The sensitivity of model results to the resolution of bathymetry data used in the model was also investigated. It was found that the performance of the model could be drastically improved by using finer bathymetric data which better captures local topographic features. The effects of two different source models were also explored.
     356
      357different event types: submarine mass failures generate larger events because of proximity, more directional wave generation
     358
     359even if data is available it is hard to access
     360
      361@article{ioualalen07,
     362title={Modeling the 26 December 2004 Indian Ocean tsunami: Case study of impact in Thailand},
      363author={Ioualalen, M. and Asavanant, J. and Kaewbanjak, N. and Grilli, S.~T. and Kirby, J.~T. and Watts, P.},
     364year={2007},
      365journal={J. Geophys. Res.},
     366volume={112},
     367doi={http://dx.doi.org/10.1029/2006JC003850}
     368}
     369
      370@article{hirata06,
     371title={The 2004 Indian Ocean tsunami: Tsunami source model from satellite altimetry},
     372author={Hirata, K. and Satake, K. and Tanioka, Y. and  Kuragano, T. and Hasegawa, Y. and   Hayashi, Y. and Hamada, N.},
      373journal={Earth, Planets and Space},
     374year={2006},
     375volume={58},
     376number={2},
     377pages={195--201}
     378}
     379
     380@InBook{asavanant08,
      381author = {Asavanant, J. and Ioualalen, M. and Kaewbanjak, N. and Grilli, S.~T. and Watts, P. and Kirby, J.~T. and Shi, F.},
     383title = {Modeling, Simulation and Optimization of Complex Processes},
     384chapter = {Numerical Simulation of the December 26, 2004: Indian Ocean Tsunami },
     385publisher = {   Springer Berlin Heidelberg},
     386year = {2008},
     387pages = {59--68},
     388}
     389
     390@article{grilli07,
     391author = {St\'{e}phan T. Grilli and Mansour Ioualalen and Jack Asavanant and Fengyan Shi and James T. Kirby and Philip Watts},
     392title = {Source Constraints and Model Simulation of the December 26, 2004, Indian Ocean Tsunami},
     393publisher = {ASCE},
     394year = {2007},
     395journal = {Journal of Waterway, Port, Coastal, and Ocean Engineering},
     396volume = {133},
     397number = {6},
     398pages = {414-428},
     399url = {http://link.aip.org/link/?QWW/133/414/1},
     400doi = {10.1061/(ASCE)0733-950X(2007)133:6(414)}
     401}