Changeset 3225

Timestamp: Jun 26, 2006, 9:42:35 AM
File: 1 edited
documentation/requirements/anuga_API_requirements.tex
\chapter{Public API}
\section{General principles}

The ANUGA API must be simple to use. Operations that are conceptually
simple should be easy to do. An example would be setting up a small
test problem on a unit square without any geographic orientation.
Complex operations should be manageable and should not require the
user to enter information that isn't strictly part of the problem
description. For example, entering UTM coordinates (or geographic
coordinates) as read from a map should not require any reference to a
particular origin. Nor should the same information have to be entered
more than once per scenario.

…

Currently ANUGA is limited to UTM coordinates assumed to belong to one
zone. ANUGA shall throw an exception if this assumption is violated.

It must be possible in general to enter data points as …

…
not exist geographically.

\item Any component that needs coordinates to be relative to a
  particular point shall be responsible for deriving that origin.
  Examples are meshes, where absolute coordinates may cause numerical
  problems. An example of a derived origin would be using the
  south-westernmost point on the boundary.

\item Coordinates must be passed around as either geospatial objects
  …
  it is currently done, it doesn't have to be changed as a matter of
  priority, but don't document this 'feature' in the user manual. If
  you are refactoring this API, then please remove geo_reference as a
  keyword.

…

\chapter{Internal API}

\section{Damage Model - Requirements}
Generally, the damage model determines a percentage damage to a set of
structures and their contents. The dollar loss for each structure and
its contents due to this damage is then determined.

The damage model used in ANUGA is expected to change. The requirements
for this damage model are based on three algorithms.

\begin{itemize}
\item Probability of structural collapse. Given the distance from the
  coast and the inundation height above ground floor, the percentage
  probability of collapse is calculated. The distance from the coast
  is 'binned' into one of 4 distance ranges. The height is binned into
  one of 5 ranges. The percentage result is therefore represented by a
  4 by 5 array.
\item Structural damage curve. Given the type of building (X or Y) and
  the inundation height above ground floor, the percentage damage loss
  to the structure is determined. The curve is based on a set of
  [height, percentage damage] points.
\item Content damage curve. Given the inundation height above ground
  floor, the percentage damage loss to the contents of each structure
  is determined.
  The curve is based on a set of [height, percentage damage] points.
\end{itemize}
Interpolate between points when using the damage curves.

The national building exposure database (NBED) gives the following
relevant information for each structure:
\begin{itemize}
\item Location: latitude and longitude.
\item The total cost of the structure.
\item The total cost of the structure's contents.
\item The building type (have to check how this is given).
\end{itemize}
This information is given in a csv file. Each row is a structure.

So how will these dry algorithms (look-up tables) be used? Given
NBED, an sww file and an assumed ground floor height, the percentage
structure and content loss and the probability of collapse can be
determined.

The probability of collapse will be used in such a way that each
structure is either collapsed or not collapsed. There will not be any
20\% probability of collapse structures when calculating the damage
loss.

This is how a definite collapsed/not-collapsed outcome will be derived
from a probability of collapse:
\begin{itemize}
\item Count the number of houses (sample size) with each unique
  probability of collapse (excluding 0).
\item probability of collapse * sample size = number of collapsed
  buildings (NCB).
\item Round the number of collapsed buildings.
\item Randomly 'collapse' NCB buildings from the sample structures.
  This is done by setting the \% damage loss to structures and
  contents to 100. This overrides losses calculated from the curves.
\end{itemize}

What is the final output? Add these columns to the NBED file:
\begin{itemize}
\item \% content damage
\item \% structure damage
\item damage cost to content
\item damage cost to structure
\end{itemize}

How will the ground floor height be given? Have it passed as a keyword
argument, defaulting to 0.3.
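The look-up, interpolation and collapse-sampling steps described above
can be sketched as follows. This is an illustrative sketch only, not
ANUGA's actual damage-model code: all function names, bin edges, table
values and curve points are hypothetical placeholders, not the real
NBED or ANUGA data.

```python
import bisect
import random

# Hypothetical bin edges and table -- NOT the real ANUGA/NBED values.
DIST_EDGES = [50.0, 100.0, 250.0]     # boundaries of the 4 distance ranges (m)
HEIGHT_EDGES = [0.3, 0.6, 1.0, 2.0]   # boundaries of the 5 height ranges (m)
# 4 by 5 array: percentage probability of collapse per (distance, height) bin.
COLLAPSE_PCT = [
    [5, 10, 20, 40, 60],
    [2,  5, 10, 25, 45],
    [1,  2,  5, 15, 30],
    [0,  1,  2,  8, 20],
]

def collapse_probability(distance, height):
    """Bin distance-from-coast and inundation height, then look up the
    percentage probability of structural collapse in the 4 by 5 array."""
    i = bisect.bisect_right(DIST_EDGES, distance)
    j = bisect.bisect_right(HEIGHT_EDGES, height)
    return COLLAPSE_PCT[i][j]

def interpolate_curve(points, height):
    """Linearly interpolate a damage curve given as sorted
    [height, percentage damage] points; clamp outside the range."""
    heights = [h for h, _ in points]
    k = bisect.bisect_left(heights, height)
    if k == 0:
        return points[0][1]
    if k == len(points):
        return points[-1][1]
    (h0, d0), (h1, d1) = points[k - 1], points[k]
    return d0 + (d1 - d0) * (height - h0) / (h1 - h0)

def collapse_structures(probabilities, rng=None):
    """Return indices of structures to mark as collapsed (100% loss).

    For each unique non-zero collapse probability (in percent),
    NCB = round(probability * sample size) structures are drawn at
    random from the structures sharing that probability.
    """
    rng = rng or random.Random(0)
    collapsed = []
    for p in sorted(set(probabilities) - {0}):
        sample = [i for i, q in enumerate(probabilities) if q == p]
        ncb = int(round(p / 100.0 * len(sample)))
        collapsed.extend(rng.sample(sample, ncb))
    return sorted(collapsed)
```

A caller would then set the \% structure and content damage of the
returned indices to 100, overriding the curve-derived losses, before
computing dollar losses from the NBED cost columns.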
\section{Damage Model - Design}
It has to be modular. In the future the three algorithms will be
combined to give a cumulative probability distribution, so this part
doesn't have to be designed to be too flexible.

…

\item a function for re-assembling model output should be made
  available
\item scoping of methodologies for automatic domain decomposition
\item implementation of automatic domain decomposition (using C
  extensions for maximal sequential performance in order to minimise
  performance penalties due to Amdahl's law)
\item in depth testing and tuning of parallel performance. This may
  require adding wrappers for non-blocking MPI communication to pypar
  if needed.
\item ability to read in precomputed sub-domains. Perhaps using
  caching.py
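The decomposition and re-assembly requirements listed above can be
illustrated with a toy round trip. This is a hedged sketch of the idea
only: ANUGA's real decomposition partitions a triangular mesh, and the
C extensions and pypar MPI wrappers mentioned above are not shown.
Both function names are hypothetical.

```python
def decompose(values, nprocs):
    """Split a flat list of per-element values into nprocs contiguous
    sub-domains of near-equal size (a stand-in for mesh partitioning)."""
    base, extra = divmod(len(values), nprocs)
    subdomains, start = [], 0
    for p in range(nprocs):
        size = base + (1 if p < extra else 0)
        subdomains.append(values[start:start + size])
        start += size
    return subdomains

def reassemble(subdomains):
    """Concatenate per-processor results back into the global ordering,
    i.e. the inverse of decompose for this contiguous partitioning."""
    out = []
    for sub in subdomains:
        out.extend(sub)
    return out
```

The point of the sketch is that decomposition must be paired with a
re-assembly function that restores the global ordering, which is the
first item in the list above.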