{11} Active Tickets by Priority (Full Description) (46 matches)
List active tickets and group by ticket priority.
high (3 matches)
Ticket | Summary | Owner | Status | Type | Severity | Created | Modified | Component

#352 | checkpointing progress to restore in case of an interrupted run | habili | new | enhancement | normal | Jun 30, 2010 | Feb 13, 2012 | Functionality and features
Domain can safely be pickled, but boundaries that use functions or lambdas cannot be pickled. We need an alternative means of checkpointing ANUGA runs, possibly by separating the boundaries from the Domain so they are not pickled along with it.

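A minimal sketch of that separation idea, using a simplified stand-in for the real Domain (the Domain class here, and the checkpoint/restore names, are hypothetical illustrations, not existing ANUGA API):

```python
import pickle

class Domain:
    """Stand-in for anuga.Domain (hypothetical, for illustration only)."""
    def __init__(self):
        self.quantities = {'stage': [0.0, 0.1, 0.2]}
        self.boundary_map = {}  # may hold unpicklable callables/lambdas

def checkpoint(domain, filename):
    """Detach the (potentially unpicklable) boundaries, pickle the rest,
    then re-attach the boundaries to the live object."""
    boundaries = domain.boundary_map
    domain.boundary_map = {}
    try:
        with open(filename, 'wb') as f:
            pickle.dump(domain, f)
    finally:
        domain.boundary_map = boundaries

def restore(filename, boundary_map):
    """Load a checkpointed domain and re-attach boundaries afterwards."""
    with open(filename, 'rb') as f:
        domain = pickle.load(f)
    domain.boundary_map = boundary_map
    return domain
```

The boundaries would have to be reconstructed (or kept around) by the run script, since only the picklable state survives the round trip.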
#213 | Review vertex and edge limiter | steve | new | defect | normal | Nov 15, 2007 | Feb 13, 2012 | Efficiency and optimisation
Will Power has identified a situation (periodic forcing) in which ANUGA demonstrates diffusive behaviour (first order in space). The problem seemed to be resolved when vertex limiting was turned off and second order timestepping was used. I claim that changing over to edge limiting should also solve the problem. Edge limiting should be implemented and tested for robustness.

#356 | compute_boundary_flows | steve | new | defect | normal | Oct 22, 2010 | Oct 22, 2010 | Architecture and API
This shows up if you use all reflective boundaries, in which case the boundary flows should be zero!

low (17 matches)

#45 | Eliminate xya2pts from data_manager and use load mesh functionality instead | Leharne | new | defect | trivial | Jan 31, 2006 | Feb 13, 2012 | Architecture and API
Affected files (grep for xya2pts):
  development/okushiri_2005/lwru2.py: from pyvolution.data_manager import xya2pts
  development/okushiri_2005/lwru2.py: xya2pts(project.bathymetry_filename, verbose = True,
  inundation/debug/mesh_error_reporting/show_mesh_bug.py: from pyvolution.data_manager import xya2pts
  inundation/debug/mesh_error_reporting/show_mesh_bug.py: xya2pts(filenames.bathymetry_filename, verbose = True,
  inundation/pyvolution/data_manager.py: def xya2pts(basename_in, basename_out=None, verbose=False,
  production/merimbula_2005/prepare.py: from pyvolution.data_manager import xya2pts
  production/merimbula_2005/prepare.py: xya2pts(project.bathymetry_filename, verbose = True)
Some file types carry the same information but can be written in different formats: points files can be pts (binary) or xya (ascii), and mesh files can be msh (binary) or tsh (ascii). All interfaces that use points or mesh files should be able to deal with both formats, which makes an xya2pts function redundant in most cases. Maybe it can be kept for testing the points interface, to make sure it handles both binary and ASCII? That functionality is tested in the import_points_file function.

#334 | Get ANUGA ready for Python 3.0 (aka py3k) | everyone | new | enhancement | minor | Jul 12, 2009 | Jun 17, 2010 | Compilation and installation
ANUGA was tested under Python 2.6 on 13 July 2009, but we need to move towards Python 3.0, for which Python 2.6 is the stepping stone. The way to see what needs to be done is to run ANUGA under Python 2.6 using the -3 flag, e.g. python -3 test_all.py. I started this process in changeset:7309 (the caching module).

#254 | Allow functions used in set_quantity and also forcing functions to use normal absolute UTM without having to specify geo_reference object | gareth | new | defect | normal | Mar 4, 2008 | Feb 13, 2012 | Functionality and features
In changeset:5090 the manual was updated to show how one should pass the geo_reference object into Polygon_function when used in set_quantity. This shouldn't be of any concern to the user of the ANUGA API and could be automated by doing the adjustment inside ANUGA. I suggest that all functions in the public API use normal (absolute) UTM and that ANUGA convert values to relative coordinates internally when calling them. The user should not have to pass in anything. Thoughts are welcome.

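A sketch of how the internal adjustment could be automated, assuming the conversion is just a shift by the mesh origin (wrap_absolute is a hypothetical helper, not part of the ANUGA API):

```python
def wrap_absolute(user_func, xllcorner, yllcorner):
    """Adapt a user function written in absolute UTM so it can be called
    with mesh-relative coordinates internally (hypothetical helper)."""
    def relative_func(x, y):
        # Internal coordinates are relative to the mesh origin; shift back
        # to absolute UTM before calling the user's function.
        return user_func(x + xllcorner, y + yllcorner)
    return relative_func

# A user-supplied elevation function written in absolute UTM
# (eastings/northings), as the ticket proposes:
def elevation(x, y):
    return 0.001 * x + 0.002 * y

internal = wrap_absolute(elevation, xllcorner=300000.0, yllcorner=6180000.0)
```

With a wrapper like this, the user never sees the geo_reference object; ANUGA would apply it once when the function is registered.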
#217 | Better error message if a boundary point lies outside the area defined in Field_boundary | jane | new | enhancement | normal | Nov 16, 2007 | Sep 11, 2009 | Functionality and features
This was suggested by Joaquim Luis and Rajaraman in Sourceforge postings in November 2007. They had an sww file used with Field_boundary which didn't quite cover the boundary segment. The old error message just said that NaNs were found, but not why, which was confusing since the sww file itself didn't contain NaNs.

#323 | Replace nonstandard zone by central_meridian | leharne | new | enhancement | normal | Mar 24, 2009 | Feb 13, 2012 | Appearance and visualisation
Now that we have the option of specifying the central meridian, we (Leharne, Myall and Ole) think we should get rid of the option of specifying an enforced zone in urs2sts and redfearn altogether. Anyone needing to project outside a zone can use central_meridian henceforth. One thing to keep in mind is how zone information is assigned to the domain and stored in the sww file: with a nonstandard meridian, zone will be set to -1.

#16 | Make sure that georeferencing is used consistently (domain, pmesh, sww files, ...) | nariman | new | defect | minor | Sep 20, 2005 | Sep 11, 2009 | Functionality and features
The georeferencing object isn't used consistently yet; some code in source:inundation/pyvolution/data_manager.py works directly with xllcorner and yllcorner. Also make sure that georeferencing is tested everywhere (especially least squares).

#149 | Consistent API for file boundaries | nariman | new | defect | minor | May 7, 2007 | Sep 8, 2009 | Architecture and API
File_boundary and Field_boundary currently take the filename of a tms or sww file, whereas Transmissive_Momentum_Set_Stage_boundary takes a callable object (e.g. a File_function). Consistency would be nice; my feeling is that the latter model is the better one.

#280 | geospatial_data should know about hemispheres | nariman | new | enhancement | minor | Apr 30, 2008 | Sep 8, 2009 | Functionality and features
Currently the points in geospatial_data are assumed to be in the Southern hemisphere. There should be an attribute so that the hemisphere is known.

#158 | polygon information has no geo reference | nariman | new | enhancement | normal | May 21, 2007 | Sep 8, 2009 | Functionality and features
We use a list of UTM coordinates to pass polygon information around. The geo-reference information for these points is not kept with the list, so the zone of the points is not known. This can cause problems, e.g. when trying to convert the info to latitudes and longitudes. Polygon info should be associated with geo-reference info. Implementation: use a geospatial object to hold polygon info.

#272 | Write blocking csv2pts script | nariman | new | enhancement | normal | Apr 17, 2008 | Feb 13, 2012 | Efficiency and optimisation
Ted Rigby has shown that ANUGA doesn't perform well when converting 18 million points in a csv file to the pts NetCDF format. Currently we do this by creating a Geospatial_data object from the csv file and then exporting it to the pts format. The problem is that all data is stored in memory, which causes thrashing for large datasets. This is holding Ted back from running a validation against the Macquarie Rivulet catchment. One solution would be to use blocking, so that the csv file isn't read until the export code is run, and then only in blocks. Another, perhaps simpler, approach would be to write a conversion script, csv2pts, which does the conversion without ever storing all data in memory.

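The blocking idea might look like this sketch, which reads the csv in fixed-size blocks so the full point set never sits in memory (read_points_blocked is a hypothetical helper; a real csv2pts would write each block straight into the pts/NetCDF file instead of yielding it):

```python
import csv

def read_points_blocked(filename, block_size=10000):
    """Yield lists of (x, y, value) tuples, at most block_size at a time,
    so an 18-million-point csv never has to fit in memory at once."""
    with open(filename, newline='') as f:
        reader = csv.reader(f)
        block = []
        for row in reader:
            block.append((float(row[0]), float(row[1]), float(row[2])))
            if len(block) >= block_size:
                yield block
                block = []
        if block:  # flush the final, possibly short, block
            yield block
```

Each yielded block can be appended to the output file before the next one is read, so peak memory is bounded by block_size regardless of file size.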
#325 | Memory not being freed in fitting | nariman | new | defect | normal | Apr 2, 2009 | Sep 8, 2009 | Efficiency and optimisation
When a simulation run does not find data it can use in the cache, it creates the cache data. This process allocates memory that is never freed, impacting other simulations on the same node. We should investigate the fitting code and explicitly 'unreference' data structures maintained in the mesh structure. This could be done either in existing code that executes when fitting has finished, or in another method that explicitly frees structures no longer required.

#358 | Remove splitting of sww files | nariman | new | enhancement | normal | Feb 25, 2011 | Feb 13, 2012 | Functionality and features
Now that the large NetCDF format has been in place for a good while and the viewer has been updated to use it, we can remove the need for splitting into chunks of less than 2GB. I think this is in store_timestep(). I propose doing this in two steps: 1) increase the variable controlling the splitting to a number large enough to prevent it, and verify all tests and validations work; 2) carefully remove the splitting algorithm in store_timestep() and the code that was modified to observe split sww files, and again verify all tests and validations. This supersedes ticket:257, which I have closed.

#223 | sww2domain is failing in (DISABLED)test_sww2domain2 | ole | reopened | defect | minor | Nov 21, 2007 | Sep 11, 2009 | Testing and validation
Currently the parameter minimum_storable_height controls the smallest depth ANUGA is willing to store. However, Rudy van Drie noticed that momenta are stored for depths smaller than that, resulting in sww files that have regions with nonzero momentum but zero depth. This is counter-intuitive and causes problems if one wants to compute flow-speed maps from sww files. Change this in data_manager.py so that momentum is set to zero in sww files whenever depth is less than minimum_storable_height. I assign this to Jane as Duncan will be busy with ticket:222.

#338 | Sponge Boundary Condition | ole | new | enhancement | minor | Aug 26, 2009 | Aug 26, 2009 | Functionality and features
It would be useful to have a boundary condition that absorbs everything without being transmissive and without reflecting anything. Attached is a paper from former advisor Jesper Larsen which outlines a technique for creating an absorbing boundary condition.

#290 | Variable friction forcing term | ole | new | enhancement | normal | Jul 16, 2008 | Feb 13, 2012 | Functionality and features
Implement a forcing term that adds friction as a generalised, depth-dependent function. Ted Rigby has the formulae and knowledge required.

#332 | Compress sww files on the fly | someone | new | enhancement | minor | Jun 19, 2009 | Jun 19, 2009 | Efficiency and optimisation
As sww files tend to get very big, ANUGA could perhaps compress them on the fly (much like Google Earth creates kmz instead of kml). The file format could be called swz. A test showed that they do compress: original sww 684,150 KB; compressed by Windows XP folder compression 337,443 KB; zipped by Zip Genius 236,883 KB. So there is file size to be saved, but compressed sww files would need to be handled by the other products such as the ANUGA viewer, WateRide and the Linux and 64-bit versions of ANUGA.

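As an illustration only, plain gzip from the Python standard library could stand in for a hypothetical swz container (the function names here are made up; real swz support would also require changes in the viewer and other tools, as noted above):

```python
import gzip
import shutil

def compress_sww(sww_filename, swz_filename):
    """Compress an existing sww into a hypothetical gzip-based 'swz'."""
    with open(sww_filename, 'rb') as src, gzip.open(swz_filename, 'wb') as dst:
        shutil.copyfileobj(src, dst)  # streams in chunks; low memory use

def open_result_file(filename):
    """Open either format transparently, dispatching on the extension."""
    if filename.endswith('.swz'):
        return gzip.open(filename, 'rb')
    return open(filename, 'rb')
```

sww files are quite repetitive (coordinates, connectivity), which is why the ratios reported in the ticket are close to 2:1 or better.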
#110 | Move parallel stuff away from shallow_water_domain and above | steve | new | defect | minor | Nov 7, 2006 | Sep 9, 2009 | Appearance and visualisation
As far as possible, it would be good to avoid references to parallel concepts in the sequential code. Perhaps this could be achieved by more virtualisation of methods overridden in parallel_domain?

lowest (2 matches)

#220 | generate mesh returns bad meshes for very small and very large maximum areas | duncan | assigned | defect | minor | Nov 20, 2007 | Nov 21, 2007 | Architecture and API
When generating a mesh, very small (~0.0001) and very large (~1e12) maximum areas are being set to a maximum area of 1.

#281 | Implement Erosion/Deposition code | ole | new | enhancement | normal | Apr 30, 2008 | Feb 13, 2012 | Functionality and features
Rudy van Drie has suggested that ANUGA be equipped with erosion and deposition capabilities. This ticket can be used to collate leads and links to that end. The uploaded file is from XBEACH and contains the code for morphing the bed. Variable bedslope is identified in ticket:191.

normal (24 matches)

#182 | More integration with GIS | Vanessa | new | enhancement | normal | Jul 3, 2007 | Feb 13, 2012 | Architecture and API
It would be good to allow ANUGA direct access to GIS files such as shape files, GML, KML and so on. This should probably be done through the GDAL library. Other tools based on GDAL are Quantum GIS (http://www.qgis.org/) and ogr2ogr (http://www.gdal.org/ogr/ogr2ogr.html). Quantum GIS is very promising as it allows very easy creation of polygons and has a light, simple interface; it is unknown how well it handles projections. Ogr2ogr is command-line driven and can convert most formats. Two important examples:

    ogr2ogr -f "GML" map_area.gml map_area.shp

creates a gml file from a shape file (that file can easily be converted to csv if needed; however, it would be better to read the GDAL formats directly), and

    ogr2ogr -f "KML" -t_srs WGS84 bay_area.kml bay_area.shp

converts a shape file in UTM coordinates to kml in WGS84 for use with Google Earth. The first task in this ticket will be to see if Python can get direct access to the underlying libraries. One binding, called pymod, ships with ogr2ogr; GDAL's may be the same thing, but I am not sure.

#243 | Diagnostics about compatibility of elevation data at boundary | gareth | new | enhancement | normal | Jan 10, 2008 | Feb 13, 2012 | Functionality and features
It'd be handy to provide diagnostics, and perhaps a warning, in case the elevation data in files used by File_boundary (and Field_boundary) don't match the elevation data used by the domain being evolved. In some cases small discrepancies can be safely ignored, e.g. if they are small relative to the water depth, but in other cases a small discrepancy can cause spurious flows. This may be the problem highlighted by Joaquim Luis in postings in January 2008. Currently File_boundary works only with conserved quantities, so elevation has to be obtained in order to do this. See changeset:4925 for a possible example.

#275 | Refactor set_values in quantity.py | habili | new | enhancement | normal | Apr 23, 2008 | Feb 13, 2012 | Architecture and API
This code needs a revamp. The semantics of 'location' and 'indices' aren't clear, and we also need the option of restricting the area by specifying a polygon, as requested by Rajaraman on 22 April 2008. I suggest we get rid of edge_values as an option and also rename 'unique vertices' to the more intuitive name 'nodes' and let that be the default; this is what is set using point sets by least squares anyway. If the user needs to set vertex_values by triangle, e.g. to include discontinuities, it may be better to imply that by the shape of the input value in case it is an array. I also suggest that we allow restrictions by region_tag in set_quantity to roll set_region in seamlessly. In fact set_region could probably be refactored using set_quantity with the polygon in question, allowing for more code reuse. This ticket should be done in conjunction with ticket:250 and ticket:254.

#211 | Use correct interpretation of xllcorner and yllcorner | leharne | new | defect | normal | Nov 12, 2007 | Sep 11, 2009 | Functionality and features
The current interpretation of xllcorner and yllcorner is wrong. As suggested by Joaquim Luis (see thread http://sourceforge.net/mailarchive/forum.php?thread_name=4727D555.7010208%40ualg.pt&forum_name=anuga-user), the first cell centre should be x0 = xllcorner + 0.5*cellsize, or equivalently x0 = xllcenter, and similarly for yllcorner.

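Applied to an ESRI ASCII grid header, the corrected interpretation can be sketched as follows (cell_centres is an illustrative helper, not ANUGA code):

```python
def cell_centres(xllcorner, yllcorner, cellsize, ncols, nrows):
    """Cell-centre coordinates for an ESRI ASCII grid. The corner values
    refer to the outer corner of the lower-left cell, so the centre of
    that cell is offset by half a cell in each direction."""
    x0 = xllcorner + 0.5 * cellsize   # equivalently, x0 = xllcenter
    y0 = yllcorner + 0.5 * cellsize
    xs = [x0 + i * cellsize for i in range(ncols)]
    ys = [y0 + j * cellsize for j in range(nrows)]
    return xs, ys
```

Treating xllcorner as if it were the first cell centre shifts every sample by half a cell, which is exactly the error the ticket describes.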
#320 | zone stored in sww is always -1 | leharne | new | defect | normal | Mar 16, 2009 | Feb 13, 2012 | Functionality and features
It seems that domains don't get the zone info from data, so sww files have zone=-1. It can easily be overridden by setting domain.geo_reference.zone manually, but it'd be best to get it from the mesh and/or data points.

#345 | STS boundary condition does not work with parallel ANUGA | leharne | new | defect | normal | Feb 16, 2010 | Feb 13, 2012 | Functionality and features
It appears that setting a File_boundary using an STS boundary file does not work when ANUGA is run in parallel. I am guessing at the following reason: the STS file is first used to create the mesh. When the file boundary is built for a subdomain, indices of edges are taken to refer to the global ordering of edges along the boundary. For this to work with parallel ANUGA there has to be some mapping from the local indices of the subdomain boundary to the global ordering used by the STS file. This could even be achieved by checking the x, y coordinates of each edge midpoint when the file boundary object is being built.

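The suggested coordinate-matching could be sketched like this (a hypothetical helper; a real implementation would probably tie the tolerance to the mesh resolution and use a spatial index rather than a linear scan):

```python
def map_local_to_global(local_midpoints, global_midpoints, tol=1e-9):
    """For each edge midpoint on a subdomain boundary, find the index of
    the matching midpoint in the global (STS-file) ordering by comparing
    x, y coordinates. Returns one global index per local edge."""
    mapping = []
    for lx, ly in local_midpoints:
        for k, (gx, gy) in enumerate(global_midpoints):
            if abs(lx - gx) <= tol and abs(ly - gy) <= tol:
                mapping.append(k)
                break
        else:
            raise ValueError('no global edge matches midpoint (%g, %g)'
                             % (lx, ly))
    return mapping
```

The mapping would be computed once, when the file boundary object is built for the subdomain, and used thereafter to index into the STS data.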
#210 | Automatically run realistic example in test suite | nariman | new | defect | normal | Nov 12, 2007 | Sep 11, 2009 | Testing and validation
We need to run all demos used in the manual. This will also check things like loading from a points file and realistic fitting procedures, and could be added to the validation suite. Also make a fast version of something like the Cairns example (two timesteps and a smaller points file) to run within the unit test suite.

#239 | Forcing functions (or at least Inflow) should understand absolute coordinates | nariman | new | enhancement | normal | Dec 18, 2007 | Sep 11, 2009 | Architecture and API
Currently all forcing functions expect spatial information in coordinates relative to the lower-left origin of the computational mesh. It would be desirable to allow absolute UTM coordinates, especially when developing new forcing terms that are part of the public API, such as rainfall, culverts etc.

#306 | Revisit starttime in boundaries, forcing terms and file_function | nariman | new | defect | normal | Nov 24, 2008 | Sep 9, 2009 | Functionality and features
Ted Rigby discovered that a hydrograph used with Transmissive_Momentum_Set_stage didn't use the real model time when domain.starttime > 0. Changing domain.time to domain.get_time in the evaluate method solves the problem, but opens the question about the general architecture. Currently relative times are passed in (e.g. file_boundary), but changing this will touch many modules and will have to be done carefully. The overriding principle, though, should be that whenever a modeller references time it should be the absolute time (internal time + starttime), just as we intend with spatial coordinates.

#326 | Boundary object that allows for the 'addition' of two time series | nariman | new | enhancement | normal | May 15, 2009 | Feb 13, 2012 | Functionality and features
The development of the storm surge modelling capability will require ANUGA to take the outputs from an offshore storm surge model at the boundary (as is done in the tsunami modelling case) and also apply a tidal forcing term. This means that the boundary term will need to take a .sts formatted deep-water storm surge output and add a tidal forcing term. It's not clear how this forcing term is applied, and whether it requires an input file or can be described by a function. This may indicate that having the functionality for either a file or a function to be applied during the addition would be advantageous.

#186 | Variable rainfall forcing function | ole | new | enhancement | normal | Jul 25, 2007 | Jul 25, 2007 | Functionality and features
Implement a forcing function for variable rainfall. Suggested requirements: take as input an arbitrary number of time series, each containing time (s) and rainfall (mm) in two columns; this is the same format we used for the rainfall test with one rainfall gauge. The time intervals might be different for each time series, i.e. some might be sampled every 300s while others are sampled at other times, and the intervals can also vary within one time series. Then build a forcing function based on this data such that for each timestep t in ANUGA:

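One way such a forcing function might evaluate a gauge's irregularly sampled series at a model time t, as an illustrative sketch only (rainfall_at is a hypothetical helper, and the ticket's requirement list is not fully reproduced above):

```python
from bisect import bisect_right

def rainfall_at(t, times, rates):
    """Linearly interpolate an irregularly sampled rainfall series
    (times in seconds, rates in mm) at model time t. Outside the
    sampled range, the nearest endpoint value is held constant."""
    if t <= times[0]:
        return rates[0]
    if t >= times[-1]:
        return rates[-1]
    i = bisect_right(times, t)          # first sample strictly after t
    t0, t1 = times[i - 1], times[i]
    r0, r1 = rates[i - 1], rates[i]
    return r0 + (r1 - r0) * (t - t0) / (t1 - t0)
```

Because each gauge keeps its own (times, rates) arrays, series sampled at different and varying intervals are handled uniformly.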
#353 | couple SWAN model | ole | new | defect | normal | Sep 15, 2010 | Sep 15, 2010 | Functionality and features
To support the storm surge inundation modelling capability, wind-generated waves need to be included. The SWAN model is widely used and can be downloaded from http://www.citg.tudelft.nl/live/pagina.jsp?id=f928097d-81bb-4042-971b-e028c00e3326&lang=en

#354 | model wave setup | ole | new | enhancement | normal | Sep 15, 2010 | Sep 15, 2010 | Functionality and features
To support the storm surge inundation modelling capability. See the report from WRL (draft delivered to GA; see TRIM 2010/1135).

#355 | couple with OSS offshore storm surge models | ole | new | enhancement | normal | Sep 15, 2010 | Sep 15, 2010 | Functionality and features
To support the storm surge inundation modelling capability.

#364 | Allow ANUGA to read gis shapefiles to define the bounding polygon | ole | new | enhancement | normal | Mar 23, 2012 | Mar 23, 2012 | Functionality and features
From Gareth Davies: At present in the code, the user inputs polygons by typing something like

    bounding_polygon = anuga.read_polygon('my_extent.csv')

where 'my_extent.csv' is a file with xy coordinate pairs (and no header), like this:

    x1, y1
    x2, y2
    x3, y3
    x4, y4

What we would like is for the user to be able to type

    bounding_polygon = anuga.read_polygon_shapefile('myshapefile.shp')

[where 'myshapefile.shp' is a polygon shapefile which contains ONLY a single polygon, the one the user wants as a bounding polygon], and for this command to have the same effect as the previous one.

Things to consider:

1) Should we support shapefiles with multiple polygons in them? I suggest no, because it would make the code more complex and there is more potential for user error. However, it could be done. Perhaps it is better to raise a warning or error if there is more than one polygon in the shapefile.

2) Should we check the orientation of the polygon? At present, I believe ANUGA can have errors if all polygons are not oriented the same way (i.e. clockwise or counter-clockwise). We could ensure that the polygon is oriented clockwise (using a simple test like this one: http://paulbourke.net/geometry/clockwise/index.html, and re-ordering if needed). I suggest this either not be done, or be done optionally with a warning when vertices are re-ordered: a user who entered some polygons as counter-clockwise csv files and then added another polygon from a counter-clockwise shapefile would not want that polygon silently re-ordered.

3) Should we support shapefiles that are not polygons (e.g. lines or points)? I think we should, with a warning; however, we must then assume the ordering of the points is the same as it would be for a polygon.

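The orientation test linked in point 2 is the signed-area (shoelace) check; a sketch with the suggested warn-and-reorder behaviour (these helpers are hypothetical, not ANUGA API):

```python
def is_clockwise(polygon):
    """Orientation test via the signed (shoelace) area. With x growing
    east and y growing north, a negative signed area means the vertices
    run clockwise."""
    area2 = 0.0  # twice the signed area
    n = len(polygon)
    for i in range(n):
        x0, y0 = polygon[i]
        x1, y1 = polygon[(i + 1) % n]
        area2 += x0 * y1 - x1 * y0
    return area2 < 0.0

def ensure_clockwise(polygon, warn=print):
    """Re-order counter-clockwise vertices, emitting the warning the
    ticket asks for rather than changing the polygon silently."""
    if not is_clockwise(polygon):
        warn('Warning: polygon vertices re-ordered to clockwise')
        return list(reversed(polygon))
    return list(polygon)
```

Making the re-ordering optional, as suggested in point 2, would just mean gating the call to ensure_clockwise behind a keyword argument.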

#369 | ANU repository and GA seem to have duplication | ole | new | defect | normal | Jun 18, 2012 | Jun 18, 2012 | Appearance and visualisation

[anu repo]\anuga\misc\tools and \ga_repo\misc\tools seem to contain the same files. The same is true for \anuga\anuga_work\development.

#372 | Solution - MKL FATAL ERROR when running anuga_core/test_all.py on the NCI | ole | new | defect | normal | Nov 14, 2012 | Nov 14, 2012 | Compilation and installation

When running anuga_core/test_all.py on the NCI and getting this error:

MKL FATAL ERROR: Cannot load neither libmkl_mc3.so nor libmkl_def.so

load the appropriate matplotlib module for your version of python. For example, for python 2.6 add

module load python/2.6-matplotlib

to your logon script (module load python/2.7.2-matplotlib for python 2.7.2, etc.). Check with the command "module avail" whether your python version has an associated matplotlib module.

#252 | Transmissive_Momentum_Set_Stage boundary may cause numerical instabilities | steve | new | defect | major | Feb 28, 2008 | Aug 25, 2008 | Functionality and features

This boundary condition was implemented to allow momentum to be derived automatically from the underlying numerical scheme, and it was instrumental in getting the Okushiri validation to work. However, it appears that numerical instabilities may arise in cases where reflected waves flow back into this boundary, not unlike the instabilities we encountered with the fully transmissive boundary. The example https://datamining.anu.edu.au/anuga/browser/anuga_work/debug/bad_time_step/run_dam.py has been boiled down to the basics and demonstrates this phenomenon. The instability occurs at the point where the reflected wave interacts with the boundary. The diagnostic output confirms that huge momentum values build up and that the resulting large speeds are not due to very shallow water. Coarsening the mesh seems to help, but it would be good to address the underlying issue. Good luck!

#102 | Get true boundaries of parallel domains | steve | new | defect | normal | Oct 19, 2006 | Sep 11, 2009 | Functionality and features

Parallel domains are equipped with 'ghost' triangles. It would be useful to have access to the true boundary and somehow hook get_boundary_polygon() into that rather than returning the extended boundary. One way would be to build the local meshes temporarily (as is done already), generate a boundary_polygon to be stored with the parallel_domain, and then override get_boundary_polygon() to return that.

Here's information from Linda (18/10/06) on how to get access to the individual meshes:

The triangles, nodes etc. are stored in a dictionary, e.g. submesh["full_nodes"], submesh["full_triangles"], submesh["full_boundary"]. To obtain, say, the triangles for submesh 1, use submesh["full_triangles"][1]. The full_... entries are the submeshes without the ghosts. The triangle structure is the same as the standard structure, but the node structure has been extended to include the global-ID-to-local-ID mapping (I have not removed any information, just extended it).

If you plan to pass these boundary polygons around as an additional structure, you will also need to update build_commun.py and change build_local.py to convert the global ID to a local ID once the information has been sent to the processor. If all that you need is the nodes and triangles (you do not need any neighbour information), you may want to look at build_local.py. build_local is supposed to be run on each processor, so the triangles and nodes for that processor are stored in submesh["full_nodes"] and submesh["full_triangles"]. The advantage here is that you will not need to update the communication calls. You still have to be careful because the information in submesh["full_..."] is originally stored in terms of the global ID; it is not until build_local_GA that the global IDs are mapped to the local IDs.

On Wed, 18 Oct 2006, Ole.Nielsen@ga.gov.au wrote:

> If you can tell me where in the code I have access to the partitioned
> meshes in terms of their nodes (Nx2, Float) and triangles (Mx3, Int).
> I was looking in build_submesh, but I am unsure if all the partitioned
> triangles are lumped into one structure. How can I get access to the
> individual submeshes (without ghosts) in terms of nodes and triangles?
> If you can tell me that, I'll be right to do it.

And from Ole's earlier message (17 Oct 2006):

>> Anyway, it works when I moved the boundary allocation to AFTER the
>> distribution, which makes heaps of sense in retrospect. I know I can
>> make boundaries independent of the domain and will do so later on.
>>
>> Meanwhile I have one request for some help. I would love to be able
>> to get the true boundary polygon for each mesh (available as
>> get_boundary_polygon() in the normal meshes). I have started doing
>> this in build_submesh.py but am unsure how to get that for EACH
>> submesh. Can you help?
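Once the ghost-free nodes and triangles of a submesh are in hand, the true boundary polygon could be computed along these lines. This is an illustrative sketch, not ANUGA code: the edges that belong to exactly one triangle form the boundary, and chaining them gives the polygon (assuming a single boundary loop).

```python
# Illustrative sketch only (not ANUGA code): given nodes (Nx2) and
# triangles (Mx3), e.g. submesh["full_nodes"]/submesh["full_triangles"],
# the edges used by exactly one triangle form the true boundary.
# Assumes a single boundary loop and no ghost triangles.
from collections import defaultdict

def boundary_polygon(nodes, triangles):
    # count how many triangles use each undirected edge
    edge_count = defaultdict(int)
    for t in triangles:
        for i in range(3):
            a, b = t[i], t[(i + 1) % 3]
            edge_count[tuple(sorted((a, b)))] += 1
    # adjacency restricted to boundary edges (those used exactly once)
    neighbours = defaultdict(list)
    for (a, b), count in edge_count.items():
        if count == 1:
            neighbours[a].append(b)
            neighbours[b].append(a)
    # walk the loop, always moving away from the previous node
    start = min(neighbours)
    loop, prev, current = [start], None, start
    while True:
        nxt = neighbours[current][0]
        if nxt == prev:
            nxt = neighbours[current][1]
        if nxt == start:
            break
        loop.append(nxt)
        prev, current = current, nxt
    return [tuple(nodes[i]) for i in loop]
```

An overridden get_boundary_polygon() on the parallel domain could then return this precomputed polygon instead of the ghost-extended one.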

#214 | Discontinuous Bathymetry | steve | new | enhancement | normal | Nov 15, 2007 | Sep 9, 2009 | Functionality and features

Discontinuous bathymetry should be implemented. This would help to deal with steep slopes such as those found at the edge of the continental shelf, and also in riverine flooding applications. It would also be useful if we implement two- or multi-layer models. We will need to update the flux calculation and the gravity term. It might be best to remove gravity as a forcing term and couple it with the basic flux calculations.

#215 | Well Balanced implementation of Flux and Gravity term | steve | new | enhancement | normal | Nov 15, 2007 | Nov 30, 2007 | Efficiency and optimisation

It is evident in zero-flow situations that the flux and gravity terms are perhaps not well balanced, or that the method used to deal with wet/dry interfaces induces unrealistic flows. The problem should be investigated to ascertain the cause, and a solution implemented.
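The zero-flow case can be turned into a concrete check. The sketch below is a generic 1-D illustration under assumed notation (constant stage w, bed z, depth h = w - z), not ANUGA's actual discretisation: since 0.5*(h_right^2 - h_left^2) = h_cell*(h_right - h_left) = -h_cell*(z_right - z_left) when the stage is constant, a gravity term discretised with the interface-average depth cancels the pressure-flux difference exactly.

```python
# Generic 1-D 'lake at rest' balance check; notation and discretisation
# are assumptions for illustration, not ANUGA's actual scheme.
g = 9.81   # gravity
w = 1.0    # constant stage: the lake-at-rest state

def cell_residual(z_left, z_right):
    """Net momentum update for one cell whose bed elevation takes the
    values z_left, z_right at its two interfaces. A well-balanced
    pairing of flux and gravity terms makes this vanish for any bed
    while the stage is constant."""
    h_left = w - z_left       # interface water depths
    h_right = w - z_right
    # pressure-flux difference across the cell (0.5*g*h^2 at each face)
    flux = 0.5 * g * (h_right ** 2 - h_left ** 2)
    # bed-slope (gravity) source, depth taken as the interface average
    h_cell = 0.5 * (h_left + h_right)
    source = g * h_cell * (z_right - z_left)
    return flux + source      # zero for a well-balanced scheme
```

A discretisation that evaluates the gravity term with a different depth (e.g. a vertex value) breaks this cancellation and produces exactly the spurious flows the ticket describes.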

#273 | Implement Kinematic Viscosity | steve | new | enhancement | normal | Apr 17, 2008 | Sep 9, 2009 | Functionality and features

Ted and Rudy are to provide the formula for the kinematic viscosity (KV) forcing term. Tests should show eddy formation.
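As a placeholder until the formula arrives: a kinematic-viscosity term generically takes the form of a diffusion operator on the velocity field. The 1-D explicit-update sketch below is purely illustrative; the function name and discretisation are assumptions, not the eventual ANUGA term.

```python
def diffuse(u, nu, dx, dt):
    """One explicit step of a kinematic-viscosity term u_t = nu * u_xx
    on a 1-D velocity array. Illustrative placeholder only.
    Stable for nu * dt / dx**2 <= 0.5; boundary values held fixed."""
    out = list(u)
    for i in range(1, len(u) - 1):
        out[i] = u[i] + nu * dt / dx ** 2 * (u[i + 1] - 2 * u[i] + u[i - 1])
    return out
```

In 2-D the same operator acts on both momentum components, which is what allows shear layers to roll up into the eddies the tests are meant to show.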

#340 | 1-D pipe flow forcing term | steve | new | enhancement | normal | Sep 9, 2009 | Sep 9, 2009 | Appearance and visualisation

A proper 1-D flow model would supersede previous (and unfinished) work on culverts. See closed ticket:145 for more info.

#374 | Consistent use of csv files | steve | new | enhancement | normal | Jan 10, 2013 | Jan 10, 2013 | Functionality and features

Just wondering if there is any scope for you to fix the way we import a time series into ANUGA. At the moment, for a hydrograph or rainfall we have to create a space-separated time series (.asc format) and then convert it to a binary (.tms format) before we can call it. This used to be the case with digital terrain data, where one would have to create a csv (x,y,z) and then convert it to a .pts file so ANUGA could read it in. Now we can just specify the .csv and ANUGA does the conversion during the initial setup stage, which is better as there is less user input. Is there any chance of getting all the other time series inputs (e.g. boundary, rainfall, hydrographs) to work the same way? The user would just create a .csv of their rainfall (or hydrograph or tide) data and, instead of specifying a .tms file, ANUGA would read the .csv and convert it to a .tms during the setup stage.
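The requested loader could be as simple as the sketch below, which reads a two-column time,value csv straight into memory with no intermediate .tms step. The function name is hypothetical; it is not an existing ANUGA API.

```python
import csv

def read_time_series_csv(source):
    """Hypothetical unified loader: read a two-column time,value csv
    (rainfall, hydrograph, tide, ...) directly, skipping the manual
    .asc-to-.tms conversion. 'source' is a file path or an open file."""
    opened = isinstance(source, str)
    f = open(source) if opened else source
    try:
        return [(float(t), float(v)) for t, v in csv.reader(f)]
    finally:
        if opened:
            f.close()
```

The setup stage could then call this for boundary, rainfall and hydrograph inputs alike and write the .tms internally, mirroring what already happens for .csv terrain data.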