Changeset 3244 for inundation/parallel


Timestamp: Jun 27, 2006, 2:20:08 PM
Author:    jack
Message:   Updated section re: running the code.
File:      1 edited

Legend:

  Unmodified lines show both revision line numbers.
  Added lines are marked with + and show only the r3244 line number.
  Removed lines are marked with - and show only the r3164 line number.
  • inundation/parallel/documentation/parallel.tex

    r3164   r3244

      144     144
      145     145    The first example in Section \ref{subsec:codeRPA} solves the advection equation on a
      146          -  rectangular mesh. A rectangular mesh is highly structured so a coordinate based decomposition can be use and the partitioning is simply done by calling the
              146  +  rectangular mesh. A rectangular mesh is highly structured so a coordinate based decomposition can be used and the partitioning is simply done by calling the
      147     147    routine \code{parallel_rectangle} as shown below.
      148     148    \begin{verbatim}
     
      252     252    \section{Running the Code}
      253     253    \subsection{Compiling Pymetis and Metis}
      254          -  Currently, Metis and its Python wrapper Pymetis are not built by the
      255          -  \verb|compile_all.py| script. A makefile is provided to automate the build
      256          -  process. Change directory to the \verb|ga/inundation/pymetis/| directory and
      257          -  ensure that the subdirectory \verb|metis-4.0| exists and contains an
              254  +  Unlike the rest of ANUGA, Metis and its Python wrapper Pymetis are not built
              255  +  by the \verb|compile_all.py| script. A makefile is provided to automate the
              256  +  build process. Change directory to the \verb|ga/inundation/pymetis/| directory
              257  +  and ensure that the subdirectory \verb|metis-4.0| exists and contains an
      258     258    unmodified Metis 4.0 source tree. Under most varieties of Linux, build the
      259     259    module by running \verb|make|. Under x86\_64 versions of Linux, build the
     
      262     262    that the module works by running the supplied PyUnit test case with
      263     263    \verb|python test_metis.py|.
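For concreteness, the build-and-test sequence described in this subsection amounts to something like the following shell session. This is a sketch of the common Linux case only; the x86\_64 systems mentioned above use a different make invocation, which is given in the surrounding documentation text.

\begin{verbatim}
cd ga/inundation/pymetis/
ls metis-4.0            # the unmodified Metis 4.0 source tree must be here
make                    # most varieties of Linux; x86_64 systems use a
                        # different make invocation (see above)
python test_metis.py    # run the supplied PyUnit test case to verify
\end{verbatim}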
              264  +  \subsection{Running the Job}
              265  +  Communication between nodes running in parallel is performed by pypar, which
              266  +  requires the following:
              267  +  \begin{itemize}
              268  +  \item Python 2.0 or later
              269  +  \item Numeric Python (including RandomArray) matching the Python installation
              270  +  \item Native MPI C library
              271  +  \item Native C compiler
              272  +  \end{itemize}
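To illustrate the role pypar plays in the requirements listed above, here is a minimal, hypothetical communication sketch. It uses Python 2 syntax to match the listed requirements and is not taken from the ANUGA sources; only the standard pypar calls (rank, size, send, receive, finalize) are assumed.

\begin{verbatim}
# Minimal pypar sketch (illustrative only, not from the ANUGA sources).
import pypar

rank = pypar.rank()   # id of this process, 0 .. size-1
size = pypar.size()   # total number of MPI processes

if rank == 0:
    # Process 0 collects a message from every other process.
    for source in range(1, size):
        print pypar.receive(source)
else:
    pypar.send('hello from process %d' % rank, 0)

pypar.finalize()
\end{verbatim}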
              273  +  Jobs are started by running appropriate commands for the local MPI
              274  +  installation. Due to variations in MPI environments, specific details
              275  +  regarding MPI commands are beyond the scope of this document. It is likely
              276  +  that parallel jobs will need to be scheduled through some kind of queuing
              277  +  system. Sample job scripts are available for adaptation in Section
              278  +  \ref{sec:codeSJ}. They should be easily adaptable to any queuing system
              279  +  derived from PBS, such as TORQUE.
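As a rough illustration of the kind of job script Section \ref{sec:codeSJ} provides, a hypothetical PBS/TORQUE submission script might look like the following. The resource requests, the mpirun invocation and the script name run_parallel_example.py are all placeholders to be adapted to the local queuing system and MPI installation.

\begin{verbatim}
#!/bin/bash
#PBS -l nodes=4                 # placeholder resource request
#PBS -l walltime=01:00:00       # placeholder time limit

cd $PBS_O_WORKDIR               # run from the submission directory

# The launch command and process count depend on the local MPI installation.
mpirun -np 4 python run_parallel_example.py   # script name is a placeholder
\end{verbatim}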