Changeset 5743
- Timestamp: Sep 6, 2008, 10:34:12 PM (17 years ago)
- Files: 4 edited
anuga_core/source/anuga/utilities/util_ext.h
r5306 → r5743

  double get_python_double(PyObject *O, char *name) {
    PyObject *TObject;
+   #define BUFFER_SIZE 80
+   char buf[BUFFER_SIZE];
    double tmp;
+   int n;

  …
    TObject = PyObject_GetAttrString(O, name);
    if (!TObject) {
-     PyErr_SetString(PyExc_RuntimeError, "util_ext.h: get_python_double could not obtain double from object");
+     n = snprintf(buf, BUFFER_SIZE, "util_ext.h: get_python_double could not obtain double %s.\n", name);
+     //printf("name = %s",name);
+     PyErr_SetString(PyExc_RuntimeError, buf);
+
      return 0.0;
    }
  …
  }

+
+
+
  int get_python_integer(PyObject *O, char *name) {
    PyObject *TObject;
-   int tmp;
+   #define BUFFER_SIZE 80
+   char buf[BUFFER_SIZE];
+   long tmp;
+   int n;

  …
    TObject = PyObject_GetAttrString(O, name);
    if (!TObject) {
-     PyErr_SetString(PyExc_RuntimeError, "util_ext.h: get_python_integer could not obtain double from object");
+     n = snprintf(buf, BUFFER_SIZE, "util_ext.h: get_python_integer could not obtain double %s.\n", name);
+     //printf("name = %s",name);
+     PyErr_SetString(PyExc_RuntimeError, buf);
      return 0;
    }

-   tmp = PyFloat_AsDouble(TObject);
+   tmp = PyInt_AsLong(TObject);

    Py_DECREF(TObject);
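For readers not tracing the C, the net effect of this change is twofold: when the named attribute is missing, the RuntimeError message now includes the attribute's name (built with snprintf into a fixed 80-byte buffer), and get_python_integer now converts the value with PyInt_AsLong rather than the incorrect PyFloat_AsDouble (the integer helper's message still says "double", carried over from the original string). A rough Python-level mirror of the corrected behaviour, as a sketch only — the real helpers are the C functions above, and these Python names merely echo them:

    def get_python_double(obj, name):
        # Fetch attribute `name` from `obj` as a float; the error now names the attribute.
        try:
            value = getattr(obj, name)
        except AttributeError:
            raise RuntimeError(
                "util_ext.h: get_python_double could not obtain double %s." % name)
        return float(value)

    def get_python_integer(obj, name):
        # Fetch attribute `name` from `obj` as an integer (previously coerced via float).
        try:
            value = getattr(obj, name)
        except AttributeError:
            raise RuntimeError(
                "util_ext.h: get_python_integer could not obtain double %s." % name)
        return int(value)  # r5743 switches the C code from PyFloat_AsDouble to PyInt_AsLong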
anuga_work/development/anuga_1d/dam_h_elevation.py
r5742 → r5743

  domain=Domain(points)

- domain.default_order = 2
- domain.set_timestepping_method('euler')
+ domain.order = 2
+ domain.set_timestepping_method('rk2')
  #domain.default_time_order = 2
  domain.cfl = 1.0
- domain.limiter = "vanleer"
+ #domain.limiter = "vanleer"

…
  domain.set_quantity('elevation', elevation_box)
  #domain.set_quantity('xmomentum', xmom_sincos)
- domain.order=domain.default_order
+ #domain.order=domain.default_order
  print "domain order", domain.order

…
  import time
  t0=time.time()
- yieldstep=10.0 #30.0
- finaltime=10.0 #20.0
+ yieldstep=45.0 #30.0
+ finaltime=45.0 #20.0
  print "integral", domain.quantities['stage'].get_integral()
  for t in domain.evolve(yieldstep=yieldstep, finaltime=finaltime):
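This switches the 1D dam-break run from first-order 'euler' time stepping to 'rk2' and lengthens the run to finaltime=45.0. The rk2 scheme itself is not part of this changeset; as a generic illustration of the difference, a Heun-type second-order step takes two Euler sub-steps and then averages the result with the saved starting state — the same a*current + b*backup blend that the quantity.py fix below repairs. The operator L and the state q here are placeholders, not ANUGA code:

    def euler_step(q, dt, L):
        # First-order explicit Euler: q <- q + dt * L(q).
        return q + dt * L(q)

    def rk2_step(q, dt, L):
        # Generic second-order (Heun) step: two Euler sub-steps, then average
        # the advanced state with the backed-up starting state (a = b = 0.5).
        q1 = q + dt * L(q)        # predictor
        q2 = q1 + dt * L(q1)      # second Euler sub-step from the predictor
        return 0.5 * (q + q2)     # corrector: 0.5*backup + 0.5*advanced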
anuga_work/development/anuga_1d/quantity.py
r5742 → r5743

  # (either from this module or C-extension)
  #saxpy_centroid_values(self,a,b)
- self.centroid_backup_values[:] = (a*self.centroid_values + b*self.centroid_backup_values).astype('f')
+ self.centroid_values[:] = (a*self.centroid_values + b*self.centroid_backup_values).astype('f')

  class Conserved_quantity(Quantity):
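The one-line fix writes the blended result back into centroid_values instead of overwriting the backup copy, so the state saved at the start of the timestep survives. A minimal NumPy sketch of the backup/blend pair — the class and method names are stand-ins chosen to echo the diff, and only the assignment line itself comes from the changeset:

    import numpy as np

    class QuantitySketch(object):
        # Stand-in for anuga_1d's Quantity, showing only the backup/blend pair.

        def __init__(self, centroid_values):
            self.centroid_values = np.asarray(centroid_values, dtype='f')
            self.centroid_backup_values = np.zeros_like(self.centroid_values)

        def backup_centroid_values(self):
            # Save the state at the start of a timestep.
            self.centroid_backup_values[:] = self.centroid_values

        def saxpy_centroid_values(self, a, b):
            # Corrected r5743 behaviour: blend into the *current* values and leave
            # the backup untouched. Before the fix the result was written into
            # centroid_backup_values, clobbering the saved state.
            self.centroid_values[:] = (a * self.centroid_values
                                       + b * self.centroid_backup_values).astype('f')

With a = b = 0.5 this blend is exactly the averaging step a second-order (rk2/Heun) update needs, which is presumably why the bug mattered once dam_h_elevation.py switched to 'rk2'.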
anuga_work/publications/boxing_day_validation_2008/modsim_article20_08.tex
r5486 → r5743 — each .tex line flagged as modified reads identically in both revisions as rendered here (the edits appear to be whitespace-only), so it is shown once and marked ±; unmarked lines are unchanged context.

   %------Abstract--------------
   \begin{abstract}
±  Geoscience Australia, in an open collaboration with the Mathematical Sciences Institute, The Australian National University, is developing a software application, ANUGA, to model the hydrodynamics of floods, storm surges and tsunamis. The free source software implements a finite volume central-upwind Godunov method to solve the non-linear depth-averaged shallow water wave equations. In light of the renewed interest in tsunami forecasting and mitigation, this paper explores the use of ANUGA to model the inundation of the Indian Ocean tsunami of December 2004. The Method of Splitting Tsunamis (MOST) was used to simulate the initial tsunami source and the tsunami's propagation at depths greater than 100m. The resulting output was used to provide boundary conditions to the ANUGA model in the shallow water. Data with respect to 4-minute bathymetry, 2-minute bathymetry, 3-arc second bathymetry and elevation were used in the open ocean, shallow water and on land, respectively. A particular aim was to make use of the comparatively large amount of observed data corresponding to this event, including tide gauges and run-up heights, to provide a conditional assessment of the computational model's performance. Specifically we compared model tsunami depth with data collected at two tide gauges and 18 coastal run-up measurements.

   Comparison between observed and modelled run-up at 18 sites show reasonable agreement. We also find modest agreement between the observed and modelled tsunami signal at the two tide gauge sites. The arrival times of the tsunami is approximated well at both sites. The amplitude of the first trough and peak is approximated well at the first tide gauge (Taphao-Noi), however the amplitude of the first wave was underestimated at the second gauge (Mercator yacht). The amplitude of subsequent peaks and troughs, at both gauges, are underestimated and a phase lag between the observed and modelled arrival times of wave peaks is evident after the first peak.
…
   The process of validating the ANUGA application is in its early stages, but initial indications are encouraging.
   As part of the Third International Workshop on Long-wave run-up Models in 2004\footnote{http://www.cee.cornell.edu/longwave}, four benchmark problems were specified to allow the comparison of numerical, analytical and physical models with laboratory and field data. One of these problems describes a wave tank simulation of the 1993 Okushiri Island tsunami off Hokkaido, Japan (Matsuyama {\it et al.} 2001)\nocite{matsuyama01}. The wave tank simulation of the Hokkaido tsunami was used as the first scenario for validating ANUGA. The dataset provided bathymetry and topography along with initial water depth and the wave specifications. The dataset also contained water depth time series from three wave gauges situated offshore from the simulated inundation area. Although good agreement was obtained between the observed and simulated water depth at each of the three gauges (Roberts {\it et al.} 2006) \nocite{roberts06} further validation is needed.

±  Although appalling, the devastation caused by the 2004 Indian Ocean tsunami has heightened community, scientific and governmental interest in tsunami and in doing so has provided a unique opportunity for further validation of tsunami models. Enormous resources have been spent to obtain many measurements of phenomenon pertaining to this event to better understand the destruction that occurred. Data sets from seismometers, tide gauges, GPS stations, a few satellite overpasses, subsequent coastal field surveys of run-up and flooding and measurements from ship-based expeditions, have now been made available (Vigny {\it et al.} 2005, Amnon {\it et al.} 2005, Kawata {\it et al.} 2005, and Liu {\it et al.} 2005)\nocite{vigny05,amnon05,kawata05,liu05}.

   An aim of this paper is to use ANUGA to undertake a regional case study of the 2004 Indian Ocean tsunami for western and southern Thailand. The specific intention is to test the model results against the observed data obtained during and in the aftermath of the tsunami.
…
   \section{Modelling the Tsunami of 24th December 2004}
±  The evolution of earthquake-generated tsunamis has three distinctive stages: generation, propagation and run-up (Titov and Gonzalez, 1997) \nocite{titov97a}. To accurately model the evolution of a tsunami all three stages must be dealt with. Here we use the Method of Splitting Tsunamis Model (MOST) to model the generation of a tsunami and open ocean propagation. The resulting data is then used to provide boundary conditions for the inundation package ANUGA (see below) which is used to simulate the propagation of the tsunami in shallow water and the tsunami run-up.

   Here we note that the MOST model was developed as part of the Early Detection and Forecast of Tsunami (EDFT) project (Titov {\it et al.} 2005)\nocite{titov05}. MOST is a suite of integrated numerical codes capable of simulating tsunami generation, its propagation across, and its subsequent run-up. The exact nature of the MOST model is explained in (Titov and Synolakis 1995, Titov and Gonzalez 1997, Titov and Synolakis 1997, and Titov {\it et al.} 2005)\nocite{titov95,titov97a,titov97b,titov05}.
…
   \subsection{Tsunami Generation}
±  The Indian Ocean tsunami of 2004 was generated by severe coseismic displacement of the sea floor as a result of one of the largest earthquakes on record. The M$_w$=9.2-9.3 mega-thrust earthquake occurred on the 26 December 2004 at 0h58'53'' UTC approximately 70 km offshore North Sumatra. The disturbance propagated 1200-1300 km along the Sumatra-Andaman trench time at a rate of 2.5-3 km.s$^{-1}$ and lasted approximately 8-10 minutes (Amnon {\it et al.} 2005)\nocite{amnon05}. At present ANUGA does not possess an explicit easy to use method for generating tsunamis from coseismic displacement, although such functionality could easily be added in the future. Implementing an explicit method for simulating coseismic displacement in ANUGA requires time for development and testing that could not be justified given the aims of the project and the time set aside for completion. Consequently in the following we employ the MOST model to determine the sea floor deformation.

   The solution of Gusiakov (1972) \nocite{gusiakov72} is used by the MOST model to calculate the initial condition. This solution describes an earthquake consisting of two orthogonal shears with opposite sign. Specifically we adopt the parameterisation of Greensdale (2007) \nocite{greensdale07} who modelled the corresponding displacement by dividing the rupture zone into three fault segments with different morphologies and earthquake parameters. Details of the parameters associated with each of three regions used here are given in the same paper. The resulting sea floor displacement is shown in Figure \ref{fig:most_3_ruptures} and ranges between 3.6 m and 6.2 m.
…
   \subsection{Tsunami Propagation}
±  We use the MOST model to simulate the propagation of the 2004 Indian Ocean tsunami in the deep ocean ocean, based on a discrete representation of the initial deformation of the sea floor, described above. Propagation is modelled using a numerical dispersion scheme that solves the non-linear shallow-water wave equations in spherical coordinates, with Coriolis terms. This model has been extensively tested against a number of laboratory experiments and was successfully used for simulations of many historical tsunamis (Titov and Synolakis 1997, Titov and Gonzalez 1997, Bourgeois {\it et al.} 1999, and Yeh {\it et al.} 1994)\nocite{titov97a,titov97b,bourgeois99,yeh94}.

   The computational domain for the MOST simulation, was defined to extend from $79.067^0$E to $104.933^0$E and from $4.933^0$S to $24.867^0$S. The bathymetry in this region was estimated using a 4 arc minute data set developed by the CSIRO specifically for the ocean forecasting system used here. It is based on dbdb2 (NRL), and GEBCO data sets. The tsunami propagation incorporated here was modelled by the Bureau of Meteorology, Australia for six hours using a time step of 5 seconds (4320 time steps in total).

±  The output of the MOST model is produced for the sole purpose of providing an approximation of the tsunami's size and momentum that can be used to estimate the tsunami run-up. ANUGA could also have been used to model the propagation of the tsunami in the open ocean. The capabilities of the numerical scheme over such a large extent, however, have not been adequately tested. This issue will be addressed in future work.

   \subsection{Tsunami Inundation}
…
   The domain was discretised into approximately 350,000 triangles. The resolution of the grid was increased in certain regions to efficiently increase the accuracy of the simulation. The grid resolution ranged between a maximum triangle area of $5\times 10^5$ m$^2$ near the Western ocean boundary to $500$ m$^2$ in the small regions surrounding the run-up points and tide gauges. The triangle size around islands and obstacles which "significantly affect" the tsunami was also reduced. The authors used their discretion to determine what obstacles significantly affect the wave through an iterative process.

±  The bathymetry and topography of the region was estimated using a data set produced by NOAA. Specifically the bathymetry was specified on a 2 arc minute grid (ETOPO2) and the topography on a 3 arc second grid. A penalised least squares technique was then used to interpolate the elevation onto the computational grid.

   \subsubsection{Boundary Conditions}
…
   After the first arrival the wave signal begins to distort. The amplitude of the second crest predicted at both sites is smaller than those observed and subsequent peaks are out of phase. From Figure \ref{fig:tide_gauges2} it is evident that the amplitude of the wave at the Mercator yacht is underestimated. Grilli {\it et al.} (2006) \nocite{grilli06} also had difficulty reproducing the wave signal at the Mercator yacht after the arrival of the initial depression.

±  The distortions of the tsunami signal, at both sites, are most likely caused by misrepresentation of the water depths at and surrounding the observation points during the simulation. For example, Taphao-Noi Island, which greatly influences local wave patterns (reflections, resonance, interference etc) is not resolved by the local bathymetry data. A similar inaccuracy is manifest in the region surrounding the Mercator yacht, in which the undisturbed water depth was overestimated. When simulated using coarse bathymetry data, the undisturbed water depth at the Mercator yacht was 36m in comparison to an observed depth of 12m.

   \begin{figure}[ht]
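The inundation section above notes that the NOAA bathymetry and topography were interpolated onto the computational mesh with a penalised least squares technique. The paper excerpt does not spell the functional out; a generic form of such a fit, in our own notation and not necessarily the exact penalty used, reads:

    % z: elevation values at the mesh vertices; d_i: scattered data at points x_i;
    % A: interpolation matrix from vertex values to the data points;
    % D: a symmetric positive semi-definite roughness operator on the mesh;
    % alpha >= 0: smoothing parameter.
    \begin{equation*}
      \min_{z} \; \sum_{i} \bigl( (Az)_i - d_i \bigr)^2 + \alpha\, z^{T} D z
      \qquad\Longrightarrow\qquad
      (A^{T}A + \alpha D)\, z = A^{T} d .
    \end{equation*}

The penalty term keeps the fit well-posed where the data points are sparse relative to the mesh resolution.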