Extending PETSc with Robust Overlapping Schwarz Preconditioners and Advanced Krylov Methods
The code in this repository reproduces the results from the following paper:
@article{jolivet2020petsc,
Author = {Jolivet, Pierre and Roman, Jose E. and Zampini, Stefano},
Title = {{KSPHPDDM} and {PCHPDDM}: Extending {PETSc} with advanced {Krylov} methods and robust multilevel overlapping {Schwarz} preconditioners},
Year = {2021},
Publisher = {Elsevier},
Journal = {Computers \& Mathematics with Applications},
Volume = {84},
Pages = {277--295},
Url = {https://github.com/prj-/jolivet2020petsc}
}
Make sure you have access to a recent FreeFEM and/or MFEM installation, compiled with PETSc and SLEPc support. More details about the HPDDM options used in the solver may be found in the KSP or the PC manual pages.
One should be able to launch the following commands, which solve the Bratu equation or compute the eigenvalues of the Laplacian on the same geometrical configurations as in the paper.
$ mpirun -np 8 FreeFem++-mpi bratu.edp -v 0
$ mpirun -np 8 FreeFem++-mpi blocking-slepc.edp -v 0 -asm
The option -v 0 minimizes the output generated by FreeFEM; see this tutorial for more information.
Here are the main command-line parameters common to all scripts.
-N (defaults to 10): number of discretization points of the global domain (users are free to use a Gmsh or DMPlex mesh as well).
For blocking-slepc.edp, default options for the three inner preconditioners compared in the paper are turned on with the additional command-line parameters -asm (for PCASM), -gamg (for PCGAMG), or -hpddm (for PCHPDDM). It is also possible to switch from EPSLOBPCG to EPSCISS by using the command-line option -eps_type ciss.
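For example, one might combine these parameters as follows (this exact combination of flags is only an illustration, not a configuration from the paper; it assumes FreeFem++-mpi is in the PATH):

```shell
# Eigenvalue solve with PCGAMG as inner preconditioner on a finer 40-point grid
mpirun -np 8 FreeFem++-mpi blocking-slepc.edp -v 0 -gamg -N 40
# Same problem, switching the SLEPc solver from EPSLOBPCG to EPSCISS
mpirun -np 8 FreeFem++-mpi blocking-slepc.edp -v 0 -asm -eps_type ciss
```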
The source code of the mini-app is in the file MatProduct.c. It is also available as mat/tests/ex237.c in the official PETSc test suite. It can be compiled with a recent PETSc installation (3.14.0 or above) and launched with the same parameters as in the paper. One can generate one's own MatSeqAIJ and then save it in binary format, or download the matrix used in the benchmark: binaryoutput.
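For generating one's own MatSeqAIJ and saving it in PETSc binary format, here is a minimal sketch (the tridiagonal 1-D Laplacian stencil, the size n, and the output file name are illustrative choices, not what was used in the benchmark):

```c
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat            A;
  PetscViewer    viewer;
  PetscInt       n = 100, i, cols[3];
  PetscScalar    v[3] = {-1.0, 2.0, -1.0}; /* 1-D Laplacian stencil */
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  /* sequential AIJ matrix with at most 3 nonzeros per row */
  ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, n, n, 3, NULL, &A); CHKERRQ(ierr);
  for (i = 0; i < n; ++i) {
    cols[0] = i - 1; /* negative column indices are ignored by MatSetValues() */
    cols[1] = i;
    cols[2] = (i == n - 1) ? -1 : i + 1;
    ierr = MatSetValues(A, 1, &i, 3, cols, v, INSERT_VALUES); CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  /* save in PETSc binary format, loadable later with MatLoad() */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_SELF, "binaryoutput", FILE_MODE_WRITE, &viewer); CHKERRQ(ierr);
  ierr = MatView(A, viewer); CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer); CHKERRQ(ierr);
  ierr = MatDestroy(&A); CHKERRQ(ierr);
  return PetscFinalize();
}
```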
- HPC resources of TGCC@CEA (resp. IDRIS@CNRS) were used under allocation A0070607519 (resp. AP010611780) granted by GENCI.