GetDDM
GetDDM combines GetDP and Gmsh to solve large-scale finite element problems using optimized Schwarz domain decomposition methods.
Examples

- Non-overlapping DDM for time-harmonic waves (Helmholtz and Maxwell)
Precompiled binaries
For demonstration purposes, download the serial precompiled ONELAB bundle for Windows (http://onelab.info/files/gmsh-getdp-Windows64.zip; 32-bit version: http://onelab.info/files/onelab-Windows32.zip), Linux (http://onelab.info/files/gmsh-getdp-Linux64.zip) or MacOS (http://onelab.info/files/gmsh-getdp-MacOSX.zip). With these precompiled binaries the examples will run in sequential mode. For parallel computations you need to recompile the codes from source with MPI support (see below).
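On Linux or MacOS, a sequential run with these binaries can be sketched as follows. The file names are placeholders (they are not files shipped with the bundle), and the resolution name DDM is the one used by the wave examples and by the parallel commands further below:

# interactive: open an example in the ONELAB GUI bundled with Gmsh
./gmsh path/to/example.pro

# batch mode, entirely from the command line
./gmsh path/to/example.geo -3                                     # generate path/to/example.msh
./getdp path/to/example.pro -msh path/to/example.msh -solve DDM   # solve sequentially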
Parallel version build
For parallel computations you need to compile GetDP and Gmsh with MPI support.
- Install MPI and CMake.
- Download PETSc from http://www.mcs.anl.gov/petsc/petsc-as/download/. (PETSc 3.4, 3.5, 3.6 and 3.7 have been tested.)
- Uncompress the PETSc archive (in this example, using PETSc 3.7.5):
tar zxvf petsc-3.7.5.tar.gz
- Configure and build PETSc. The configuration options depend on the calculations you want to perform (complex- or real-valued), as well as on your compiler/MPI/BLAS/LAPACK setup. Make sure that mpicc, mpicxx and mpif90 are in your path. Then run:
cd petsc-3.7.5
export PETSC_DIR=$PWD
export PETSC_ARCH=mpi_mumps_complex
# Notes:
# * Remove option --with-scalar-type=complex to build in real arithmetic
# * Use option --download-fblaslapack=1 if you don't have optimized BLAS/LAPACK libraries available on your system
./configure --with-debugging=0 --with-clanguage=cxx --with-shared-libraries=0 --with-x=0 --download-mumps=1 --download-metis=1 --download-parmetis=1 --download-scalapack=1 --download-blacs=1 --with-scalar-type=complex
make
cd ..
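Optionally, the PETSc build can be checked with the standard test target before going further (a minimal sanity check; the exact tests run and their output depend on your MPI setup):

# optional: run the small PETSc example problems to validate the build
cd petsc-3.7.5
make PETSC_DIR=$PWD PETSC_ARCH=mpi_mumps_complex test
cd ..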
- Download and unzip the Gmsh and GetDP source code from the ONELAB bundle: http://onelab.info/files/gmsh-getdp-source.zip. Alternatively, you can check out the latest development version of Gmsh and GetDP from the source repositories (see http://getdp.info and http://gmsh.info for more information):
svn co https://onelab.info/svn/gmsh/trunk gmsh --username gmsh     # the password is the same as the username
svn co https://onelab.info/svn/getdp/trunk getdp --username getdp  # the password is the same as the username
- Configure, compile and install a minimal Gmsh library (it will be used by GetDP):
cd gmsh-xxx
mkdir lib
cd lib
cmake -DDEFAULT=0 -DENABLE_PARSER=1 -DENABLE_POST=1 -DENABLE_BUILD_LIB=1 ..
make lib
sudo make install/fast   # or 'make DESTDIR=$HOME/install-path-gmsh/ install/fast' if you do not have root access
cd ../..
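If the library was installed with the DESTDIR variant mentioned in the comment above (i.e. without root access), GetDP's CMake will not find it automatically. A minimal sketch of exporting the corresponding search path (the install directory is the same hypothetical one as above; with DESTDIR, files end up under <DESTDIR>/usr/local by default):

# let the GetDP configuration step below find the non-root Gmsh install
export CMAKE_PREFIX_PATH=$HOME/install-path-gmsh/usr/local:$CMAKE_PREFIX_PATH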
- Configure and compile the MPI version of GetDP (change CC, CXX and FC depending on your MPI installation):
cd getdp-xxx
mkdir bin
cd bin
cmake -DENABLE_MPI=1 ..
# Notes:
# * use option -DCMAKE_PREFIX_PATH=non-standard-install-path if your libraries are installed in non-standard locations
# * use options -DPETSC_DIR=... -DPETSC_ARCH=... if the corresponding environment variables are not set properly
make
cd ../..
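To make sure the MPI compiler wrappers are the ones actually picked up by CMake, they can be forced through the usual environment variables when configuring (a sketch assuming OpenMPI/MPICH-style wrapper names; adapt CC, CXX and FC to your MPI installation):

# from within getdp-xxx/bin: force CMake to use the MPI wrappers
CC=mpicc CXX=mpicxx FC=mpif90 cmake -DENABLE_MPI=1 ..
make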
- [Optional - only needed for parallel mesh generation] Configure and compile the MPI version of Gmsh (Metis must be disabled because of a version clash with PETSc):
cd gmsh-xxx
mkdir bin
cd bin
cmake -DENABLE_MPI=1 -DENABLE_METIS=0 ..
make
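Gmsh can print the options it was built with, which is a quick way to confirm that MPI support actually made it into the binary (the relevant entry should appear in the reported build options):

# print version and build options of the freshly compiled Gmsh
./gmsh -info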
Parallel runs
The commands for running GetDDM in parallel depend on your particular MPI setup. On 100 CPUs, the procedure typically looks like this:
mpirun -np 100 gmsh file.geo -
mpirun -np 100 getdp file.pro -solve DDM
Sample scripts for the SLURM and PBS schedulers are available at http://onelab.info/files/ddm_waves/run_slurm.sh and http://onelab.info/files/ddm_waves/run_pbs.sh.
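For reference, a minimal SLURM batch script in the spirit of the samples above could look like the following sketch. The job parameters and the module name are assumptions to adapt to your cluster, and srun may need to be replaced by mpirun depending on your MPI/SLURM integration:

#!/bin/bash
#SBATCH --job-name=getddm
#SBATCH --ntasks=100          # one MPI process per subdomain (assumed setup)
#SBATCH --time=01:00:00
#SBATCH --output=getddm_%j.log

module load openmpi           # module name is an assumption; adapt to your cluster

srun gmsh file.geo -
srun getdp file.pro -solve DDM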
References
- C. Geuzaine, B. Thierry, N. Marsic, D. Colignon, A. Vion, S. Tournier, Y. Boubendir, M. El Bouajaji, and X. Antoine. An Open Source Domain Decomposition Solver for Time-Harmonic Electromagnetic Wave Problems. 2014 IEEE International Conference on Antenna Measurements & Applications. November 16-19, Antibes Juan-les-Pins, France.
- B. Thierry, A. Vion, S. Tournier, M. El Bouajaji, D. Colignon, N. Marsic, X. Antoine and C. Geuzaine. GetDDM: An Open Framework for Testing Optimized Schwarz Methods for Time-Harmonic Wave Problems. Preprint, 2015. http://www.montefiore.ulg.ac.be/~geuzaine/preprints/getddm_preprint.pdf