Changeset 8191 for trunk/anuga_core/source/anuga_parallel/INSTALL-README
Timestamp: Jul 7, 2011, 10:23:57 AM
trunk/anuga_core/source/anuga_parallel/INSTALL-README
=========================
INSTALLING anuga_parallel
=========================

MPI
===

First thing, you need to install MPI on your system. OPENMPI and MPICH2
are supported by pypar (see below) so both should be ok.

Make sure mpi works. You should be able to run a program in parallel.

…

PYPAR
=====

We use pypar as the interface between mpi and python. The most recent
version of PYPAR is available from

http://code.google.com/p/pypar/

(There is an old version on sourceforge
http://sourceforge.net/projects/pypar/)

Install pypar following the instructions in the download. You should be
able to use the standard

python setup.py install

Make sure the pypar examples work.


PYMETIS
=======

In the anuga_parallel directory there is a subdirectory pymetis.

Follow the instructions in README to install. Essentially just run make.

From the pymetis directory, test using test_all.py

ANUGA_PARALLEL
==============

Should now be ready to run some parallel anuga code.

Go back to the anuga_parallel directory and run test_all.py

Hopefully that all works.
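As a rough illustration of what the partitioning step underneath pymetis produces, consider the sketch below. It is a toy, not the pymetis API: the real METIS partitioner assigns each triangle of the mesh to a processor while minimising the number of edges cut between sub-meshes, whereas this hypothetical helper simply splits the triangle ids contiguously to show the shape of the result.

```python
# Toy stand-in for a mesh partitioner (hypothetical, NOT the pymetis API).
# A real partitioner (METIS) balances triangle counts while minimising
# the communication boundary; this one only balances the counts.

def toy_partition(num_triangles, num_procs):
    """Return a list p where p[i] is the processor owning triangle i."""
    base, extra = divmod(num_triangles, num_procs)
    assignment = []
    for proc in range(num_procs):
        # The first `extra` processors each take one leftover triangle.
        count = base + (1 if proc < extra else 0)
        assignment.extend([proc] * count)
    return assignment

if __name__ == '__main__':
    # 10 triangles over 4 processors -> [0, 0, 0, 1, 1, 1, 2, 2, 3, 3]
    print(toy_partition(10, 4))
```

Each processor then builds and evolves only its own sub-mesh, exchanging boundary values with its neighbours via MPI.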
Run run_parallel_sw_merimbula.py

First just run it as a sequential program.

Then try a parallel run using a command like

mpirun -np 4 python run_parallel_sw_merimbula.py

…

You should look at the code in run_parallel_sw_merimbula.py

Essentially a fairly standard example, with the extra command

domain = distribute(domain)

which sets up all the parallel stuff.

Also for efficiency reasons we only set up the original full sequential mesh
on processor 0, hence the statement

…

The output will be an sww file associated with each
processor.

There is a script anuga/utilities/sww_merge.py which provides
a function to merge sww files into one sww file for viewing
with the anuga viewer.
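To give an idea of what merging per-processor output involves, here is a conceptual sketch; the helper name is hypothetical and this is not the sww_merge.py API. Each processor writes results for its own triangles, and merging stitches the pieces back into the global triangle ordering using the same triangle-to-processor assignment that split the mesh.

```python
# Conceptual sketch only (hypothetical helper, NOT the real sww_merge API).
# Given the triangle -> processor assignment used to split the mesh,
# rebuild one global per-triangle list from the per-processor pieces,
# assuming each processor stored its triangles in global order.

def merge_values(assignment, per_proc_values):
    """Return the global per-triangle list reassembled from pieces."""
    iters = [iter(vals) for vals in per_proc_values]
    # Walk the global ordering, pulling each value from its owner's piece.
    return [next(iters[proc]) for proc in assignment]

if __name__ == '__main__':
    assignment = [0, 0, 1, 1, 2]              # 5 triangles on 3 processors
    pieces = [[1.0, 2.0], [3.0, 4.0], [5.0]]  # each processor's results
    print(merge_values(assignment, pieces))   # -> [1.0, 2.0, 3.0, 4.0, 5.0]
```

The real script additionally handles the sww (NetCDF) file format and the per-quantity time series, but the reassembly idea is the same.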