GetDDM
Revision as of 19:05, 6 November 2016

GetDDM combines GetDP and Gmsh to solve large-scale finite element problems using optimized Schwarz domain decomposition methods.

Examples

Precompiled binaries

For demonstration purposes, download the serial precompiled ONELAB bundle for Windows64, Windows32, Linux64, Linux32 or MacOSX. With these precompiled binaries the examples will run in sequential mode. For parallel computations you need to recompile the codes from source with MPI support (see below).

Parallel version build

For parallel computations you need to compile GetDP and Gmsh with MPI support.

  • Install MPI, CMake and Subversion.
  • Uncompress the PETSc archive (in this example, using PETSc 3.4.4):
tar zxvf petsc-3.4.4.tar.gz
  • Configure and build PETSc. The configuration options depend on the calculations you want to perform (complex- or real-valued), as well as on your compiler/MPI setup. Make sure that mpicc, mpicxx and mpif90 are in your PATH. Then run (remove --with-scalar-type=complex to build in real arithmetic):
cd petsc-3.4.4
export PETSC_DIR=$PWD
export PETSC_ARCH=mpi_mumps_complex
./configure --with-debugging=0 --with-clanguage=cxx --with-shared-libraries=0 --with-x=0  --download-mumps=1 --download-metis=1 --download-parmetis=1 --download-scalapack=1 --download-blacs=1 --with-scalar-type=complex
make
cd ..
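Before running the configure step above, it can help to verify that the MPI compiler wrappers really are on your PATH. A minimal sketch (the check_tools helper name is ours, not part of PETSc or GetDP):

```shell
#!/bin/sh
# check_tools: report any of the named tools missing from PATH;
# returns non-zero if at least one is missing.
check_tools() {
  missing=""
  for t in "$@"; do
    command -v "$t" >/dev/null 2>&1 || missing="$missing $t"
  done
  if [ -n "$missing" ]; then
    echo "missing:$missing"
    return 1
  fi
  echo "all tools found"
}

# Typical call before configuring PETSc:
# check_tools mpicc mpicxx mpif90 cmake
```

If a tool is reported missing, fix your MPI installation or PATH before configuring, since PETSc's configure will otherwise pick up the wrong compilers.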
  • Configure, compile and install a minimal Gmsh library (it will be used by GetDP):
cd gmsh-xxx
mkdir lib
cd lib
cmake -DDEFAULT=0 -DENABLE_PARSER=1 -DENABLE_POST=1 -DENABLE_BUILD_LIB=1 ..
make lib
sudo make install/fast
cd ../..
  • Configure and compile the MPI version of GetDP (change CC, CXX and FC depending on your MPI installation):
cd getdp-xxx
mkdir bin
cd bin
cmake -DENABLE_MPI=1 ..
make
cd ../..
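Note that the GetDP configuration relies on the PETSC_DIR and PETSC_ARCH environment variables set earlier; if you configure GetDP in a fresh shell session, they must be exported again so CMake can locate the PETSc build. A sketch (the path is an example; adjust it to wherever you unpacked PETSc):

```shell
# Re-export the PETSc location when starting from a new shell
export PETSC_DIR=$HOME/petsc-3.4.4   # example path: wherever PETSc was unpacked
export PETSC_ARCH=mpi_mumps_complex  # must match the arch used when building PETSc
cd getdp-xxx/bin
cmake -DENABLE_MPI=1 ..
make
```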
  • [Optional - only for parallel mesh generation] Configure and compile the MPI version of Gmsh (Metis must be disabled because of a version clash with the Metis bundled by PETSc):
cd gmsh
mkdir bin
cd bin
cmake -DENABLE_MPI=1 -DENABLE_METIS=0 ..
make

Parallel runs

The commands for running GetDDM in parallel depend on your particular MPI setup. On 100 CPUs, the procedure typically looks like this:

mpirun -np 100 gmsh file.geo -
mpirun -np 100 getdp file.pro -solve DDM 

Sample scripts for SLURM and PBS schedulers are available: [1], [2].
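As a rough illustration of what such a scheduler script might contain, here is a minimal SLURM sketch; the job name, wall-time limit and the use of srun as the MPI launcher are assumptions that depend on your cluster, so adapt them or use the sample scripts linked above:

```shell
#!/bin/bash
#SBATCH --job-name=getddm   # assumed job name
#SBATCH --ntasks=100        # one MPI rank per subdomain
#SBATCH --time=01:00:00     # assumed wall-time limit

# Mesh in batch mode, then solve with the DDM resolution
srun gmsh file.geo -
srun getdp file.pro -solve DDM
```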

References

  • C. Geuzaine, B. Thierry, N. Marsic, D. Colignon, A. Vion, S. Tournier, Y. Boubendir, M. El Bouajaji, and X. Antoine. An Open Source Domain Decomposition Solver for Time-Harmonic Electromagnetic Wave Problems. 2014 IEEE International Conference on Antenna Measurements & Applications. November 16-19, Antibes Juan-les-Pins, France.