#368 closed defect (worksforme)
ANUGA parallel leaking along domain portion edges
Reported by: | martins | Owned by: | steve |
---|---|---|---|
Priority: | normal | Milestone: | |
Component: | Architecture and API | Version: | |
Severity: | major | Keywords: | Parallel |
Cc: |
Description
When I run a simulation using multiple processors, I get an artefact along the edges of the domain partitions that quickly spreads, ruining the simulation.
Attachments (5)
Change History (12)
Changed 13 years ago by
Attachment: | ThreeSimulations.zip added |
---|
comment:1 Changed 13 years ago by
Tried to run your simulation but got the error
Traceback (most recent call last):
File "run_model_parallel.py", line 27, in <module>
from setup_model import project, trigs_min
ImportError: No module named setup_model
Could you upload the extra modules needed to run your problem?
I am wondering if checkpointing may be causing the problem. Do you get the same error without checkpointing?
Changed 13 years ago by
Attachment: | TwoSimulationsNoCheck.zip added |
---|
Two 6-CPU simulations at different mesh resolutions
Changed 13 years ago by
Attachment: | ForSteve.zip added |
---|
comment:2 Changed 13 years ago by
Owner: | changed from ole to steve |
---|
comment:3 Changed 13 years ago by
Status: | new → assigned |
---|
Changed 13 years ago by
Attachment: | run_model_parallel.py added |
---|
Script with elevation set for the sequential domain
comment:4 Changed 13 years ago by
The problem was caused by setting the elevation quantity independently on each of the parallel domains.
A proper fix would involve setting up a parallel set_quantity method which communicates the vertex values from the processors owning the vertices to the neighbouring processors.
A quick fix is to ensure that the elevation is set in the sequential part of the code, i.e. where the original domain is created on processor 0, before it is distributed. See the attached script.
This problem only occurs for elevation, as the other important quantities (stage, x and y momentum) are updated in the evolve loop, whereas elevation is assumed to be continuous and is not updated.
There should be a method for testing the continuity of elevation in the parallel case.
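Such a continuity check could be sketched as follows. This is a hypothetical helper, not part of ANUGA: given each subdomain's vertex coordinates and elevation values, it reports any vertex position where two subdomains disagree on elevation.

```python
import numpy as np

def check_elevation_continuity(subdomains, tol=1e-12):
    """Check that vertices shared between parallel subdomains agree on elevation.

    `subdomains` is a list of (coordinates, elevations) pairs, where
    coordinates is an (n, 2) array of vertex positions and elevations is
    an (n,) array of elevation values at those vertices.  Returns the
    positions where two subdomains disagree.
    (Hypothetical sketch, not an actual ANUGA method.)
    """
    seen = {}          # rounded (x, y) -> elevation from the first subdomain seen
    mismatches = []
    for coords, elev in subdomains:
        for (x, y), z in zip(coords, elev):
            key = (round(float(x), 9), round(float(y), 9))
            if key in seen and abs(seen[key] - z) > tol:
                mismatches.append(key)
            seen.setdefault(key, z)
    return mismatches
```

An empty result means every shared vertex carries the same elevation on all processors; a non-empty result reproduces the symptom described in this ticket, where independently evaluated elevations differ along partition edges.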
comment:5 Changed 13 years ago by
Component: | Functionality and features → Architecture and API |
---|---|
Priority: | high → normal |
Changed 13 years ago by
Attachment: | Busselton.sww added |
---|
Some offshore effects along domain segment edges: real, or a viewer artefact?
comment:6 Changed 13 years ago by
Resolution: | → worksforme |
---|---|
Status: | assigned → closed |
comment:7 Changed 13 years ago by
With the latest version of the anuga_parallel code, there is a member function of the domain which will merge the parallel sww files.
Just put
domain.sww_merge()
after the evolve loop (see run_parallel_sw_merimbula.py in the anuga_parallel directory).
You can also use
domain.sww_merge(delete_old=True)
if you are brave and want to delete the individual sww files.
Look in anuga_parallel/test_parallel_frac_op.py.
There is a parallel version of the Inlet_operator which must be imported from anuga_parallel.parallel_operator_factory, i.e.
from anuga_parallel.parallel_operator_factory import Inlet_operator
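Conceptually, merging the per-processor output amounts to scattering each subdomain's local values back into a single global array using its local-to-global index mapping. A toy illustration of that idea with plain numpy arrays follows; the names and data layout here are hypothetical, and the real sww_merge works on the netCDF-based sww format rather than bare arrays.

```python
import numpy as np

def merge_quantity(global_size, pieces):
    """Assemble a global quantity array from per-processor pieces.

    `pieces` is a list of (global_indices, local_values) pairs, one per
    processor, mimicking how each parallel subdomain knows which global
    elements it owns.  (Toy illustration of the merge idea only, not the
    actual ANUGA sww_merge implementation.)
    """
    merged = np.empty(global_size)
    for indices, values in pieces:
        merged[np.asarray(indices)] = values  # scatter local values to global slots
    return merged
```

In this toy picture, running on more processors just splits the index sets differently; the merged result is the same either way.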
The same simulation run on one, two and three CPUs