
GetDDM combines GetDP and Gmsh to solve large-scale finite element problems using optimized Schwarz domain decomposition methods.


Precompiled binaries

For demonstration purposes, download the serial pre-compiled ONELAB bundle for Windows (32 bit), Linux or MacOS. With these precompiled binaries the examples will run in sequential mode. For parallel computations you need to recompile the codes from source with MPI support (see below).

Parallel version build

For parallel computations you need to compile GetDP and Gmsh with MPI support.

  • Install MPI and CMake.
  • Uncompress the PETSc archive (in this example, using PETSc 3.7.5):
tar zxvf petsc-3.7.5.tar.gz
  • Configure and build PETSc. The configuration options depend on the calculations you want to perform (complex- or real-valued), as well as your compiler/MPI/Blas/Lapack setup. Make sure that mpicc, mpicxx and mpif90 are in your path. Then run:
cd petsc-3.7.5
export PETSC_ARCH=mpi_mumps_complex
# Notes:
# * Remove option --with-scalar-type=complex to build in real arithmetic
# * Use option --download-fblaslapack=1 if you don't have optimized blas/lapack libraries available on your system
./configure --with-debugging=0 --with-clanguage=cxx --with-shared-libraries=0 --with-x=0 --download-mumps=1 --download-metis=1 --download-parmetis=1 --download-scalapack=1 --download-blacs=1 --with-scalar-type=complex
make # PETSc's configure prints the exact make command to run, with the appropriate PETSC_DIR/PETSC_ARCH values
cd ..
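The GetDP build below locates PETSc through the PETSC_DIR and PETSC_ARCH environment variables. A minimal setup, assuming PETSc was unpacked in your home directory and configured with the architecture name used above, might look like:

```shell
# Assumed location: adjust PETSC_DIR to wherever you unpacked and configured PETSc.
export PETSC_DIR=$HOME/petsc-3.7.5
export PETSC_ARCH=mpi_mumps_complex
```

Adding these lines to your shell startup file avoids having to pass -DPETSC_DIR and -DPETSC_ARCH to cmake later on.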
  • Download the Gmsh and GetDP source code from the ONELAB subversion server:
svn co https://onelab.info/svn/gmsh/trunk gmsh --username gmsh # the password is the same as the username
svn co https://onelab.info/svn/getdp/trunk getdp --username getdp # the password is the same as the username
  • Configure, compile and install a minimal Gmsh library (it will be used by GetDP):
cd gmsh
mkdir lib
cd lib
cmake -DDEFAULT=0 -DENABLE_BUILD_LIB=1 .. # the exact cmake options may vary with the Gmsh version
make lib
sudo make install/fast # or 'make DESTDIR=$HOME/install-path-gmsh/ install/fast' if you do not have root access
cd ../..
  • Configure and compile the MPI version of GetDP (change CC, CXX and FC depending on your MPI installation):
cd getdp
mkdir bin
cd bin
cmake -DENABLE_MPI=1 ..
# Notes:
# * use option -DCMAKE_PREFIX_PATH=non-standard-install-path if you have libraries installed in non-standard locations
# * use options -DPETSC_DIR=... -DPETSC_ARCH=... if the corresponding environment variables are not set properly
make
cd ../..
  • [Optional - only for parallel mesh generation] Configure and compile the MPI version of Gmsh (Metis must be disabled due to a version clash with PETSc):
cd gmsh
mkdir bin
cd bin
cmake -DENABLE_MPI=1 -DENABLE_METIS=0 ..
make
cd ../..

Parallel runs

The exact commands for running GetDDM in parallel depend on your MPI setup. On 100 CPUs, the procedure typically looks like this:

mpirun -np 100 gmsh file.geo -
mpirun -np 100 getdp file.pro -solve DDM

Sample scripts for SLURM and PBS schedulers are available: [1], [2].
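Such scripts follow the usual batch-scheduler pattern: request a number of MPI tasks, then launch the mesher and the solver. A minimal SLURM sketch is given below; the job name, time limit, rank count and file names are placeholders to adapt to your cluster and problem.

```shell
#!/bin/bash
# Minimal SLURM sketch (placeholders throughout): request 100 MPI ranks,
# then mesh the geometry and run the DDM solver, as in the mpirun example.
#SBATCH --job-name=getddm
#SBATCH --ntasks=100
#SBATCH --time=01:00:00

srun -n 100 gmsh file.geo -
srun -n 100 getdp file.pro -solve DDM
```

Depending on the site configuration, `srun` may need to be replaced by `mpirun -np $SLURM_NTASKS`.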


References

  • C. Geuzaine, B. Thierry, N. Marsic, D. Colignon, A. Vion, S. Tournier, Y. Boubendir, M. El Bouajaji, and X. Antoine. An Open Source Domain Decomposition Solver for Time-Harmonic Electromagnetic Wave Problems. 2014 IEEE International Conference on Antenna Measurements & Applications, November 16-19, Antibes Juan-les-Pins, France.