Home
The following configuration scripts assume that hdf5, netcdf, and (after installation) petsc are installed under /usr/local/, and that cmake, zlib, and mpi are under /usr/. If your system differs, modify the paths accordingly.
#!/bin/sh -v
# usage: copy this script into the PETSc clone or source directory, and run it there.
PATH=/usr/local/bin:/usr/bin:/bin
PACKAGE_ROOT=/usr/local/
HDF5_CURRENT=hdf5-1.10-parallel
NC4_CURRENT=netcdf-4.x-hdf5-parallel
MPI_DIR=/usr/
HDF5_DIR=$PACKAGE_ROOT/$HDF5_CURRENT
NC4_DIR=$PACKAGE_ROOT/$NC4_CURRENT
echo $MPI_DIR
echo $HDF5_DIR
BLASLAPACK_LIB_DIR=/usr/lib/x86_64-linux-gnu
CMAKE_DIR=/usr
ZLIB_DIR_LIB=/usr/lib/x86_64-linux-gnu
ZLIB_DIR_INC=/usr/include
PETSC_SOURCE_DIR=${PWD}
INSTALL_DIR=$PACKAGE_ROOT/petsc-x-noopt
PETSC_ARCH=arch-linux-noopt
## so run this shell script under the petsc source directory
cd ./
./configure \
--prefix=$INSTALL_DIR \
--with-clean=1 --with-c2html=0 --with-x=0 \
--with-ssl=0 --with-debugging=0 --with-valgrind=0 \
--with-cxx-dialect=C++11 \
--with-shared-libraries=1 --with-precision=double \
--with-index-size=32 --with-memalign=16 --with-64-bit-indices=0 \
--with-mpi-dir=$MPI_DIR --known-mpi-shared-libraries=0 --with-mpi=1 \
--with-blaslapack-dir=$BLASLAPACK_LIB_DIR \
--with-zlib-lib=$ZLIB_DIR_LIB/libz.so --with-zlib-include=$ZLIB_DIR_INC \
--with-cmake-dir=$CMAKE_DIR \
--with-hdf5-dir=$HDF5_DIR \
--download-sowing=yes \
--download-metis=yes \
--download-parmetis=yes \
--download-mumps=yes \
--download-scalapack=yes \
--download-superlu=yes \
--download-superlu-dist=yes \
--download-hypre=yes \
LIBS=" -L$ZLIB_DIR_LIB -lz -lm" \
PETSC_DIR=$PETSC_SOURCE_DIR \
PETSC_ARCH=$PETSC_ARCH \
COPTFLAGS=" -fPIC -O0 " \
FCOPTFLAGS="-fPIC -O0 " \
CXXOPTFLAGS=" -fPIC -O0 " \
FOPTFLAGS=" -O0 "
When the 'configure' step above finishes successfully, it prints a line like 'make PETSC_DIR= ............ all'. Copy that command line and paste it to run it.
If the build completes without errors, it then prints a line like 'make PETSC_DIR= ....... install'. Following that hint, copy the command line and paste it to run it. PETSc will be installed in the directory /usr/local/petsc-x-noopt/, which is what PETSC_DIR points to.
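For reference, with the variables used in the script above, those two commands usually take the following form (a sketch only; configure prints the exact command for your build):
make PETSC_DIR=$PETSC_SOURCE_DIR PETSC_ARCH=arch-linux-noopt all
make PETSC_DIR=$PETSC_SOURCE_DIR PETSC_ARCH=arch-linux-noopt install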
NOTE:
Since some time in 2021, the PETSc installation has not included the hdf5 Fortran libraries by default. Because hdf5-fortran is the default I/O library for PFLOTRAN, we MUST manually add two libraries to the PETSc variables, as follows:
vi /usr/local/petsc-x-noopt/lib/petsc/conf/petscvariables
OR, more likely: sudo vi /usr/local/petsc-x-noopt/lib/petsc/conf/petscvariables
After opening that text file, insert the two hdf5 Fortran libraries into the 3 lines that contain ' -lhdf5_hl -lhdf5', so that each becomes ' -lhdf5hl_fortran -lhdf5_fortran -lhdf5_hl -lhdf5'. Those 3 lines begin with 'HDF5_LIB = ', 'PETSC_EXTERNAL_LIB_BASIC = ', and 'PETSC_WITH_EXTERNAL_LIB = ', respectively. NOTE: depending upon your hdf5 version, ' -lhdf5hl_fortran' may instead need to be ' -lhdf5_hl_fortran'.
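If you prefer, the same edit can be scripted; a minimal sketch assuming the install prefix used above (sed's -i.bak keeps a backup copy of the original file):
sudo sed -i.bak 's/-lhdf5_hl -lhdf5/-lhdf5hl_fortran -lhdf5_fortran -lhdf5_hl -lhdf5/g' /usr/local/petsc-x-noopt/lib/petsc/conf/petscvariables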
If using OpenMPI built with gcc (here on macOS), there is a linker error with mpifort (mpif90), i.e. the Fortran wrapper. So the script (zsh) is as follows:
#!/bin/zsh -v
# usage: copy this script into the PETSc clone or source directory, and run it there.
PACKAGE_ROOT=/usr/local/gcc-x/openmpi-x-gcc
HDF5_CURRENT=hdf5-1.12-parallel
NC4_CURRENT=netcdf-4.x-hdf5-parallel
MPI_DIR=$PACKAGE_ROOT/openmpi-4.x
HDF5_DIR=$PACKAGE_ROOT/$HDF5_CURRENT
NC4_DIR=$PACKAGE_ROOT/$NC4_CURRENT
echo $MPI_DIR
echo $HDF5_DIR
BLASLAPACK_LIB_DIR=/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr
CMAKE_DIR=/Applications/CMake.app/Contents
#petsc requires zlib-1.3
#ZLIB_DIR=/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr
#ZLIB_DIR_LIB=/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/lib
#ZLIB_DIR_INC=/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/include
PETSC_SOURCE_DIR=${PWD}
INSTALL_DIR=$PACKAGE_ROOT/petsc-x-noopt
PETSC_ARCH=aarch64-apple-darwin-x
## so run this shell script under the petsc source directory
cd ./
./configure \
--prefix=$INSTALL_DIR \
--with-clean=0 --with-c2html=0 --with-x=0 \
--with-ssl=0 --with-debugging=0 --with-valgrind=0 \
--with-cxx-dialect=C++17 \
--with-shared-libraries=1 --with-precision=double \
--with-memalign=32 --with-64-bit-indices=0 \
--with-mpi-dir=$MPI_DIR --known-mpi-shared-libraries=0 --with-mpi=1 \
--with-blaslapack-dir=$BLASLAPACK_LIB_DIR \
--with-cmake-dir=$CMAKE_DIR \
--with-hdf5-dir=$HDF5_DIR \
--packages-download-dir=/Users/f9y/devtools \
--download-zlib=yes \
--download-sowing=yes \
--download-metis=yes \
--download-parmetis=yes \
--download-mumps=/Users/f9y/devtools/MUMPS_5.6.2.tar.gz \
--download-scalapack=yes \
--download-superlu=yes \
--download-superlu-dist=yes \
--download-hypre=yes \
PETSC_DIR=$PETSC_SOURCE_DIR \
PETSC_ARCH=$PETSC_ARCH \
CPPFLAGS=" -I$MPI_DIR/include" \
COPTFLAGS=" -fPIC -O0 -I$MPI_DIR/include" \
FCOPTFLAGS="-fPIC -O0 -I$MPI_DIR/include" \
CXXOPTFLAGS=" -fPIC -O0 -I$MPI_DIR/include" \
FOPTFLAGS=" -fPIC -O0 -I$MPI_DIR/include " \
LDFLAGS=" -Wl,-ld_classic -L$MPI_DIR/lib"
NOTE: (1) The LDFLAGS option "-Wl,-ld_classic" is required for OpenMPI to work on Apple arm64.
(2) When building PFLOTRAN with this PETSc, you need to add a line like the following to its makefile:
MYFLAGS += ${FC_DEFINE_FLAG}PETSC_HAVE_FORTRAN_GET_COMMAND_ARGUMENT
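To help diagnose the mpifort linker issue mentioned above, OpenMPI's compiler wrappers accept a --showme option that prints the underlying command line without running it; a quick check:
mpifort --showme:link
(shows the link flags the Fortran wrapper passes to the linker)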
(COMING)
If not coupled with ELM, this repository builds as a stand-alone PFLOTRAN model:
cd ./src/pflotran
make PETSC_DIR=$PETSC_DIR th_characteristic_curves=TRUE pflotran
make PETSC_DIR=$PETSC_DIR test
(where $PETSC_DIR is your PETSc installation directory)
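Once built, the stand-alone executable is run against an input deck with the -pflotranin option; a minimal sketch (the input file name my_problem.in and the process count are placeholders):
mpirun -n 4 ./pflotran -pflotranin my_problem.in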
If coupling ELM with PFLOTRAN, you need to build a library named libpflotran.a.
cd ./src/pflotran-elm-interface
FIRST (OPTIONAL), if PFLOTRAN source code files are missing, run make to copy or link the needed files (*.F90):
make copy_common_src
(OR, make link_common_src to softlink the needed PFLOTRAN source codes)
(OR, make clean_common_src to remove the PFLOTRAN source code files and leave only the CLM-PFLOTRAN interface codes, if needed)
SECONDLY, build the library
make PETSC_DIR=$PETSC_DIR column_mode=TRUE libpflotran.a
(OR, make PETSC_DIR=$PETSC_DIR column_mode=TRUE debugbuild=TRUE libpflotran.a
for a library built with '-g -O0', so that the code can be debugged)
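A quick sanity check that the archive was produced and populated (ar is a standard binutils tool):
ar -t libpflotran.a | head
(lists the first object files contained in the library)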
FINALLY, build ELM v1.1 or above with this library as usual, BUT you must modify ELM's makefile or Macro.make as follows.
I. Macro.make modified for coupling build -
ifeq ($(MODEL), clm)
FFLAGS := $(FFLAGS) $(PFLOTRAN_INC)
endif
......
ifeq ($(MODEL), driver)
LDFLAGS := $(LDFLAGS) $(PFLOTRAN_LIB)
endif
NOTE: The modified Macro above requires the two variables PFLOTRAN_INC and PFLOTRAN_LIB to be defined (see III and IV below).
II. Macro.cmake (E3SM master since 2019-07)
if("${MODEL}" STREQUAL "clm")
set(FFLAGS "${FFLAGS} $(PFLOTRAN_INC)")
endif()
......
if("${MODEL}" STREQUAL "driver")
......
set(LDFLAGS "${LDFLAGS} $(PFLOTRAN_LIB)")
endif()
III. config_machines.xml editing FFLAGS and LDFLAGS for all or specific compilers. NOTE: if this is added, there is no need to modify 'Macro' or 'Macro.make' under the case directory.
......
<FFLAGS>
<!-- A NOTE here: $(PFLOTRAN_INC) may contain both the PETSC and the actual PFLOTRAN include dirs, or only the PETSC include dir -->
<append MODEL="clm"> $(PFLOTRAN_INC) </append>
</FFLAGS>
......
<LDFLAGS>
<!-- A NOTE here: $(PFLOTRAN_LIB) may contain both the PETSC libraries and the actual PFLOTRAN library, or only the PETSC libraries -->
<append MODEL="driver"> $(PFLOTRAN_LIB) </append>
</LDFLAGS>
IV. config_machines.xml editing for each supported machine (example: CADES at ORNL). NOTE: if this is NOT done here, edit 'env_mach_specific.xml' after './case.setup' to turn the options on.
<!-- for CLM-PFLOTRAN coupling, PETSC_PATH must be defined per machine, usually in .bashrc -->
<!-- the following is PETSc v.3.8.x or above -->
<environment_variables compiler="gnu" mpilib="openmpi">
<env name="PETSC_PATH">/software/user_tools/current/cades-ccsi/petsc-x/openmpi-1.10-gcc-5.3</env> <!-- PETSc v3.8.x or above -->
</environment_variables>
<!-- hack for PFLOTRAN coupling to build the ELM model.
this is a temporary solution: the user must manually edit the following
in 'env_mach_specific.xml' after case.setup;
otherwise, the model will build/run as non-PFLOTRAN-coupled.
-->
<environment_variables>
<!-- The following pflotran is with PETSc-v3.8.x or above on CADES-->
<env name="CLM_PFLOTRAN_SOURCE_DIR">/lustre/or-hydra/cades-ccsi/proj-shared/models/pflotran-dev/src/pflotran-elm-interface</env>
<!-- by blanking the following 2 names, the PETSc libs are excluded from e3sm.exe when NOT coupling with PFLOTRAN -->
<env name="PFLOTRAN_INC"> -I$ENV{CLM_PFLOTRAN_SOURCE_DIR} -I$ENV{PETSC_PATH}/include</env>
<env name="PFLOTRAN_LIB"> -L$ENV{CLM_PFLOTRAN_SOURCE_DIR} -lpflotran -L$ENV{PETSC_PATH}/lib -lpetsc -lmetis -lparmetis</env>
</environment_variables>
NOTE: Make sure that $CLM_PFLOTRAN_SOURCE_DIR and $PETSC_PATH are defined in your bash environment, and of course that libpflotran.a
has been prebuilt as described above. If you DON'T want to include this library in your e3sm.exe (and thus will not use the coupled model), comment the entries out in 'env_mach_specific.xml' as follows:
<environment_variables>
<!--
<env name="CLM_PFLOTRAN_SOURCE_DIR">/lustre/or-hydra/cades-ccsi/proj-shared/models/pflotran-dev/src/pflotran-elm-interface</env>
<env name="PFLOTRAN_INC"> -I$ENV{CLM_PFLOTRAN_SOURCE_DIR} -I$ENV{PETSC_PATH}/include</env>
<env name="PFLOTRAN_LIB"> -L$ENV{CLM_PFLOTRAN_SOURCE_DIR} -lpflotran -L$ENV{PETSC_PATH}/lib -lpetsc -lmetis -lparmetis</env>
-->
</environment_variables>
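After editing, a plain-shell check to confirm which PFLOTRAN entries are active or commented out:
grep -n 'PFLOTRAN' env_mach_specific.xml
Then rebuild the case as usual (e.g., ./case.build in a standard CIME case).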
UPDATED: 2024-04-26