Compile ElmerFEM natively for Apple Silicon? #503

Open
ghostforest opened this issue Jul 23, 2024 · 17 comments

Comments

@ghostforest

Is Elmer supposed to compile natively on M1/M2/M3 Macs? I did not get it working, so I switched to a Rosetta environment for compilation.

This is what worked:

arch -x86_64 /usr/bin/env bash --login

cmake -DCMAKE_C_COMPILER=/usr/local/Cellar/gcc/14.1.0_2/bin/gcc-14 \
      -DCMAKE_CXX_COMPILER=/usr/local/Cellar/gcc/14.1.0_2/bin/g++-14 \
      -DCMAKE_Fortran_COMPILER=/usr/local/Cellar/gcc/14.1.0_2/bin/gfortran \
      -DWITH_ELMERGUI=TRUE \
      -DWITH_OpenMP=TRUE \
      -DWITH_MPI=TRUE \
      -DMPI_C_COMPILER=/usr/local/opt/open-mpi/bin/mpicc \
      -DMPI_CXX_COMPILER=/usr/local/opt/open-mpi/bin/mpicxx \
      -DMPI_Fortran_COMPILER=/usr/local/opt/open-mpi/bin/mpifort \
      -DWITH_QT5=TRUE \
      -DWITH_QWT=FALSE \
      -DQWT_INCLUDE_DIR=/usr/local/opt/qwt/lib/qwt.framework/Headers \
      -DQWT_LIBRARY=/usr/local/opt/qwt/lib/qwt.framework/qwt \
      -DOCC_INCLUDE_DIR=$(brew --prefix opencascade)/include/opencascade \
      -DOCC_LIBRARY_DIR=$(brew --prefix opencascade)/lib \
      -DCMAKE_PREFIX_PATH=$(brew --prefix qt@5) \
      -DCMAKE_INSTALL_PREFIX=../install .. --log-level=DEBUG

make -j$(sysctl -n hw.logicalcpu)

make install
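
For reference, a quick way to check whether the resulting binaries are x86_64 (Rosetta) or arm64 (native); the install path below is an assumption based on the -DCMAKE_INSTALL_PREFIX used above:

# report the architecture of the installed solver binary
# (path assumes the ../install prefix passed to CMake above)
file ../install/bin/ElmerSolver

# report the architecture the current shell is running under
arch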


@ghostforest
Author

Switching to Clang for compiling Elmer really means trouble: problems with Fortran, Open MPI, ...

@jeffhammond

There is no problem compiling Open MPI with Clang and you are free to combine Clang as the C and C++ compiler with gfortran as the Fortran compiler.
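
For example, a minimal mixed-toolchain configure might look like this (a sketch only; the compiler paths are typical Xcode/Homebrew defaults and may differ on your machine):

# Apple Clang for C/C++, Homebrew gfortran for Fortran
cmake -DCMAKE_C_COMPILER=/usr/bin/clang \
      -DCMAKE_CXX_COMPILER=/usr/bin/clang++ \
      -DCMAKE_Fortran_COMPILER=$(brew --prefix gcc)/bin/gfortran \
      ..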

@jeffhammond

Normally when people say something doesn't compile, they include the error messages and log files associated with said failure.
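
If it helps, the configure and build output can be captured like this (file names are arbitrary):

# capture the full configure and build output to attach to the issue
cmake .. --log-level=DEBUG 2>&1 | tee cmake-configure.log
make 2>&1 | tee make-build.log
# CMake also writes detailed logs under CMakeFiles/ in the build directory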

@ghostforest
Author

ghostforest commented Jul 24, 2024

Maybe you can share the steps you take to compile Elmer, then? I spent four hours yesterday, and one solved problem led to another: first Open MPI was not compatible, then Open MPI was not found, then gfortran was not working, then UMFPACK. So I think it's easier, if you got it working, for you to share how you did it.

cmake -DCMAKE_C_COMPILER=/usr/bin/clang \
      -DCMAKE_CXX_COMPILER=/usr/bin/clang++ \
      -DCMAKE_Fortran_COMPILER=/opt/homebrew/bin/gfortran \
      -DWITH_ELMERGUI=TRUE \
      -DWITH_OpenMP=TRUE \
      -DWITH_MPI=TRUE \
      -DMPI_C_COMPILER=/opt/homebrew/bin/mpicc \
      -DMPI_CXX_COMPILER=/opt/homebrew/bin/mpicxx \
      -DMPI_Fortran_COMPILER=/opt/homebrew/bin/mpifort \
      -DMPIEXEC=/opt/homebrew/bin/mpiexec \
      -DQWT_INCLUDE_DIR=/opt/homebrew/opt/qwt/lib/qwt.framework/Headers \
      -DQWT_LIBRARY=/opt/homebrew/opt/qwt/lib/qwt.framework/qwt \
      -DOCC_INCLUDE_DIR=$(brew --prefix opencascade)/include/opencascade \
      -DOCC_LIBRARY_DIR=$(brew --prefix opencascade)/lib \
      -DCMAKE_PREFIX_PATH=$(brew --prefix qt@5) \
      -DCMAKE_INSTALL_PREFIX=../install \
      -DOpenMP_C_FLAGS="-Xpreprocessor -fopenmp -I$(brew --prefix libomp)/include" \
      -DOpenMP_C_LIB_NAMES="omp" \
      -DOpenMP_CXX_FLAGS="-Xpreprocessor -fopenmp -I$(brew --prefix libomp)/include" \
      -DOpenMP_CXX_LIB_NAMES="omp" \
      -DOpenMP_omp_LIBRARY=$(brew --prefix libomp)/lib/libomp.dylib \
      .. --log-level=DEBUG

CMake Error at /opt/homebrew/Cellar/cmake/3.30.1/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:233 (message):
  Could NOT find MPI (missing: MPI_Fortran_FOUND) (found version "3.1")
Call Stack (most recent call first):
  /opt/homebrew/Cellar/cmake/3.30.1/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:603 (_FPHSA_FAILURE_MESSAGE)
  /opt/homebrew/Cellar/cmake/3.30.1/share/cmake/Modules/FindMPI.cmake:1841 (find_package_handle_standard_args)
  CMakeLists.txt:230 (FIND_PACKAGE)

This can't be the correct cmake command. I'd suppose just typing cmake .. from the build directory would work, but it does not.
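
One way to narrow down the MPI_Fortran_FOUND failure is to check that the Fortran wrapper works on its own, outside of CMake (a sketch, assuming Open MPI's Homebrew wrappers):

# show what the Fortran wrapper actually invokes (Open MPI syntax)
/opt/homebrew/bin/mpifort --showme

# compile and run a trivial MPI Fortran program
cat > hello_mpi.f90 << 'EOF'
program hello
  use mpi
  implicit none
  integer :: ierr, rank
  call MPI_Init(ierr)
  call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
  print *, 'rank', rank
  call MPI_Finalize(ierr)
end program hello
EOF
/opt/homebrew/bin/mpifort hello_mpi.f90 -o hello_mpi
/opt/homebrew/bin/mpiexec -n 2 ./hello_mpi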

@jeffhammond

The macOS build on M2 is almost trivial:

git clone --recursive https://github.com/ElmerCSC/elmerfem.git
cd elmerfem
mkdir build
cd build
MPI_HOME=/opt/homebrew/Cellar/open-mpi/5.0.3_1 CMAKE_INSTALL_PREFIX=$HOME/Work/Apps/ELMER/install CFLAGS="-std=gnu89" cmake ..
cmake --build .
cmake --install .
ctest

@jeffhammond

You should not set OpenMP options on macOS if you use Apple Clang; Apple Clang does not support OpenMP. Both Clang and GCC from Homebrew do, on the other hand.
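
A quick way to check which compiler is in use and whether it accepts OpenMP at all (the Homebrew paths are assumptions):

# which clang is first on PATH, and which toolchain is active
clang --version
xcode-select -p

# Apple Clang typically rejects -fopenmp with an "unsupported option" error;
# Homebrew LLVM Clang accepts it
echo 'int main(void){return 0;}' | /usr/bin/clang -fopenmp -x c -c - -o /dev/null
echo 'int main(void){return 0;}' | $(brew --prefix llvm)/bin/clang -fopenmp -x c -c - -o /dev/null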

@ghostforest
Author

ghostforest commented Jul 24, 2024

Will it even compile with MPI without setting

-DWITH_OpenMP=TRUE \
-DWITH_MPI=TRUE \

?

Now I get this error when using your command:

12 warnings generated.
[  1%] Linking C shared library libmatc.dylib
[  1%] Built target matc
[  1%] Building C object matc/src/CMakeFiles/Matc_bin.dir/main.c.o
[  1%] Linking C executable matc
[  1%] Built target Matc_bin
[  1%] Built target umfpack_srcs
[  1%] Generating umfpack_zl_wsolve.c
In file included from /Users/User/Dev/elmerfem/umfpack/src/umfpack/umfpack_solve.c:31:
/Users/User/Dev/elmerfem/umfpack/src/umfpack/include/umf_internal.h:29:10: fatal error: 'string.h' file not found
#include <string.h>
         ^~~~~~~~~~
1 error generated.
make[2]: *** [umfpack/src/umfpack/umfpack_zl_wsolve.c] Error 1
make[2]: *** Deleting file `umfpack/src/umfpack/umfpack_zl_wsolve.c'
make[1]: *** [umfpack/src/umfpack/CMakeFiles/umfpack.dir/all] Error 2
make: *** [all] Error 2

@ghostforest
Author

ghostforest commented Jul 24, 2024

But that is the command I used...

I'd like to compile Elmer with the GUI and with Open MPI.

@jeffhammond

Sometimes Apple breaks their C toolchain. If the C standard headers aren't found, try reinstalling the Xcode command-line tools.
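
A sketch of what that usually looks like (standard Apple tooling; double-check before deleting anything):

# check where the active toolchain and SDK live
xcode-select -p
xcrun --show-sdk-path

# reinstall the command-line tools if the SDK headers have gone missing
sudo rm -rf /Library/Developer/CommandLineTools
xcode-select --install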

@ghostforest
Author

ghostforest commented Jul 24, 2024

I got it to compile, seemingly correctly, with:

MPI_HOME=/opt/homebrew/Cellar/open-mpi/5.0.3_1 \
cmake -DCMAKE_C_COMPILER=/opt/homebrew/opt/llvm/bin/clang \
      -DCMAKE_CXX_COMPILER=/opt/homebrew/opt/llvm/bin/clang++ \
      -DMPI_C_COMPILER=/opt/homebrew/opt/open-mpi/bin/mpicc \
      -DMPI_CXX_COMPILER=/opt/homebrew/opt/open-mpi/bin/mpicxx \
      -DMPI_Fortran_COMPILER=/opt/homebrew/opt/open-mpi/bin/mpifort \
      -DWITH_ELMERGUI=TRUE \
      -DWITH_QT5=TRUE \
      -DWITH_QWT=FALSE \
      -DWITH_OpenMP=TRUE \
      -DWITH_MPI=TRUE \
      -DQWT_INCLUDE_DIR=/usr/local/opt/qwt/lib/qwt.framework/Headers \
      -DQWT_LIBRARY=/usr/local/opt/qwt/lib/qwt.framework/qwt \
      -DOCC_INCLUDE_DIR=$(brew --prefix opencascade)/include/opencascade \
      -DOCC_LIBRARY_DIR=$(brew --prefix opencascade)/lib \
      -DCMAKE_PREFIX_PATH=$(brew --prefix qt@5) \
      -DCMAKE_C_FLAGS="-fopenmp" \
      -DCMAKE_CXX_FLAGS="-fopenmp" \
      -DCMAKE_EXE_LINKER_FLAGS="-L/opt/homebrew/opt/libomp/lib -lomp" \
      -DCMAKE_INSTALL_PREFIX=../install .. --log-level=DEBUG

However, the shell environment seems to be crucial:

export PATH="/opt/homebrew/opt/open-mpi/bin:$PATH"
export LDFLAGS="-L/opt/homebrew/opt/open-mpi/lib $LDFLAGS"
export CPPFLAGS="-I/opt/homebrew/opt/open-mpi/include $CPPFLAGS"
export OMPI_CC=/opt/homebrew/opt/llvm/bin/clang
export OMPI_CXX=/opt/homebrew/opt/llvm/bin/clang++
export OMPI_FC=/usr/local/bin/gfortran

# OpenMP settings
export LD_LIBRARY_PATH=/opt/homebrew/opt/libomp/lib:$LD_LIBRARY_PATH
export DYLD_LIBRARY_PATH=/opt/homebrew/opt/libomp/lib:$DYLD_LIBRARY_PATH 

However:
87% tests passed, 122 tests failed out of 922

The following tests FAILED:
	124 - CurvedBoundaryCylH_np3 (Failed)
	125 - CurvedBoundaryCylH_np8 (Failed)
	243 - FixTangentVelo (Failed)
	248 - H1BasisEvaluation (Failed)
	249 - HarmonicNS (Failed)
	264 - HelmholtzFEM (Failed)
	271 - HelmholtzStructure (Failed)
	272 - HelmholtzStructure2 (Failed)
	273 - HelmholtzStructure3 (Failed)
	275 - Hybrid2dMeshPartitionMetis_np8 (Failed)
	276 - Hybrid2dMeshPartitionMetisConnect_np8 (Failed)
	278 - HydrostaticNSVec-ISMIP-HOM-C (Failed)
	285 - InductionHeating2 (Failed)
	286 - InductionHeating3 (Failed)
	339 - MazeMeshPartitionMetisContig_np6 (Failed)
	355 - ModelPDEthreaded (Failed)
	357 - MonolithicSlave2 (Failed)
	381 - NonnewtonianChannelFlow_vec (Failed)
	430 - PlatesEigenComplex (Failed)
	431 - PlatesHarmonic (Failed)
	503 - SD_H1BasisEvaluation (Failed)
	504 - SD_HarmonicNS (Failed)
	505 - SD_LinearFormsAssembly (Failed)
	511 - SD_NonnewtonianChannelFlow_vec (Failed)
	525 - SD_Step_stokes_heat_vec (Failed)
	526 - SD_Step_stokes_vec (Failed)
	527 - SD_Step_stokes_vec_blockprec (Failed)
	567 - Shell_with_Solid_Beam_EigenComplex (Failed)
	573 - ShoeboxFsiHarmonicPlate (Failed)
	574 - ShoeboxFsiStatic (Failed)
	576 - ShoeboxFsiStaticShell (Failed)
	587 - StatCurrentVec2 (Failed)
	600 - Step_stokes_heat_vec (Failed)
	601 - Step_stokes_vec (Failed)
	602 - Step_stokes_vec_blockprec (Failed)
	611 - StressConstraintModes3 (Failed)
	616 - TEAM30a_3ph_transient (Failed)
	637 - VectorHelmholtzImpMatrix (Failed)
	638 - VectorHelmholtzWaveguide (Failed)
	639 - VectorHelmholtzWaveguide2 (Failed)
	640 - VectorHelmholtzWaveguide3 (Failed)
	641 - VectorHelmholtzWaveguide4 (Failed)
	643 - VectorHelmholtzWaveguideNodal (Failed)
	644 - VectorHelmholtzWaveguideQuadBlock (Failed)
	645 - VectorHelmholtzWaveguide_TM (Failed)
	655 - WinkelPartitionMetis_np8 (Failed)
	656 - WinkelPartitionMetisConnect_np8 (Failed)
	657 - WinkelPartitionMetisRec_np8 (Failed)
	673 - Zirka (Failed)
	692 - circuits2D_harmonic_foil (Failed)
	693 - circuits2D_harmonic_london (Failed)
	694 - circuits2D_harmonic_massive (Failed)
	695 - circuits2D_harmonic_stranded (Failed)
	696 - circuits2D_harmonic_stranded_explicit_coil_resistance (Failed)
	697 - circuits2D_harmonic_stranded_homogenization (Failed)
	698 - circuits2D_scan_harmonics (Failed)
	699 - circuits2D_transient_foil (Failed)
	700 - circuits2D_transient_london (Failed)
	701 - circuits2D_transient_massive (Failed)
	702 - circuits2D_transient_stranded (Failed)
	703 - circuits2D_transient_variable_resistor (Failed)
	704 - circuits2D_with_hysteresis (Failed)
	705 - circuits2D_with_hysteresis_axi (Failed)
	706 - circuits_harmonic_foil (Failed)
	707 - circuits_harmonic_foil_anl_rotm (Failed)
	708 - circuits_harmonic_foil_wvector (Failed)
	709 - circuits_harmonic_homogenization_coil_solver (Failed)
	710 - circuits_harmonic_massive (Failed)
	711 - circuits_harmonic_stranded (Failed)
	712 - circuits_harmonic_stranded_homogenization (Failed)
	742 - freesurf_maxd_np4 (Failed)
	745 - freesurf_maxd_local_np4 (Failed)
	770 - linearsolvers_cmplx (Failed)
	772 - mgdyn2D_compute_average_b (Failed)
	773 - mgdyn2D_compute_bodycurrent (Failed)
	774 - mgdyn2D_compute_complex_power (Failed)
	775 - mgdyn2D_em (Failed)
	776 - mgdyn2D_em_conforming (Failed)
	777 - mgdyn2D_em_harmonic (Failed)
	778 - mgdyn2D_harmonic_anisotropic_permeability (Failed)
	779 - mgdyn2D_pm (Failed)
	780 - mgdyn2D_scan_homogenization_elementary_solutions (Failed)
	781 - mgdyn2D_steady_wire (Failed)
	782 - mgdyn_3phase (Failed)
	786 - mgdyn_airgap_force_np2 (Failed)
	787 - mgdyn_airgap_harmonic (Failed)
	803 - mgdyn_harmonic (Failed)
	804 - mgdyn_harmonic_loss (Failed)
	805 - mgdyn_harmonic_wire (Failed)
	806 - mgdyn_harmonic_wire_Cgauge (Failed)
	807 - mgdyn_harmonic_wire_Cgauge_automatic (Failed)
	808 - mgdyn_harmonic_wire_impedanceBC (Failed)
	809 - mgdyn_harmonic_wire_impedanceBC2 (Failed)
	811 - mgdyn_lamstack_lowfreq_harmonic (Failed)
	813 - mgdyn_lamstack_widefreq_harmonic (Failed)
	815 - mgdyn_nodalforce2d (Failed)
	818 - mgdyn_steady_coils (Failed)
	824 - mgdyn_steady_quad_extruded_restart (Failed)
	825 - mgdyn_steady_quad_extruded_restart_np3 (Failed)
	832 - mgdyn_thinsheet_harmonic (Failed)
	836 - mgdyn_torus_harmonic (Failed)
	839 - mgdyn_wave_eigen (Failed)
	840 - mydyn_wave_harmonic (Failed)
	866 - pointload2 (Failed)
	872 - radiation (Failed)
	873 - radiation2 (Failed)
	874 - radiation2d (Failed)
	875 - radiation2dAA (Failed)
	876 - radiation2d_deform (Failed)
	877 - radiation2d_spectral (Failed)
	878 - radiation2dsymm (Failed)
	879 - radiation3d (Failed)
	880 - radiation_bin (Failed)
	881 - radiation_dg (Failed)
	882 - radiator2d (Failed)
	883 - radiator3d (Failed)
	884 - radiator3d_box (Failed)
	885 - radiator3d_box2 (Failed)
	886 - radiator3d_open (Failed)
	887 - radiator3d_radiosity (Failed)
	888 - radiator3d_spectral (Failed)
	889 - radiator3d_symm (Failed)

Again, anything but trivial. I'm not a programmer, though, and I wanted it to compile with Open MPI and ElmerGUI.
Is there info on which tests are supposed to fail? Does Elmer operate as expected with those tests failing?
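
One way to dig into the failures is to re-run only the failing tests with their output visible (standard CTest options, run from the build directory):

# re-run only the tests that failed in the previous run, printing their output
ctest --rerun-failed --output-on-failure

# or run a single suspicious test by name
ctest -R HelmholtzFEM --output-on-failure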

@krivard

krivard commented Jul 26, 2024

Hi, I'm also hoping to run Elmer on Apple Silicon with the GUI enabled.

My system is an M1 running macOS 12.6.3, using:

  • homebrew cmake 3.30.1
  • Apple clang 14.0.0
  • Apple c++ 14.0.0
  • homebrew gfortran 14.0.1

I got a much simpler config to work than @ghostforest's above:

$ export SDKROOT=/Library/Developer/CommandLineTools/SDKs/MacOSX13.1.sdk
$ MPI_HOME=/opt/homebrew/Cellar/open-mpi/5.0.3_1/ CMAKE_INSTALL_PREFIX=$ELMER_ENV_ROOT/install CFLAGS="-std=gnu89" CMAKE_PREFIX_PATH=/opt/homebrew/Cellar/qt@5//5.15.13_1/lib/cmake cmake -DWITH_QT5:BOOL=ON -DWITH_QWT:BOOL=OFF -DWITH_ELMERGUI:BOOL=ON -DWITH_PARAVIEW:BOOL=ON ../
$ make -j4 install
(SDKROOT solves a bizarre compile error; the full message is below.)

cmake found the correct header file location in CMAKE_OSX_SYSROOT, but something else in the build system ignored that in favor of 🤷 :

$ make install
[...]
14 warnings generated.
[  1%] Building C object matc/src/CMakeFiles/matc.dir/str.c.o
[  1%] Building C object matc/src/CMakeFiles/matc.dir/urand.c.o
[  1%] Building C object matc/src/CMakeFiles/matc.dir/variable.c.o
[  1%] Linking C shared library libmatc.dylib
[  1%] Built target matc
[  1%] Building C object matc/src/CMakeFiles/Matc_bin.dir/main.c.o
[  1%] Linking C executable matc
[  1%] Built target Matc_bin
[  1%] Built target umfpack_srcs
[  1%] Generating umfpack_zl_wsolve.c
In file included from [path/to]/elmer/elmerfem/umfpack/src/umfpack/umfpack_solve.c:31:
[path/to]/elmer/elmerfem/umfpack/src/umfpack/include/umf_internal.h:29:10: fatal error: 'string.h' file not found
#include <string.h>
         ^~~~~~~~~~
1 error generated.
make[2]: *** [umfpack/src/umfpack/umfpack_zl_wsolve.c] Error 1
make[2]: *** Deleting file `umfpack/src/umfpack/umfpack_zl_wsolve.c'
make[1]: *** [umfpack/src/umfpack/CMakeFiles/umfpack.dir/all] Error 2
make: *** [all] Error 2

but I am also seeing failing tests:

93% tests passed, 69 tests failed out of 922
[...]
The following tests FAILED:
	124 - CurvedBoundaryCylH_np3 (Failed)
	125 - CurvedBoundaryCylH_np8 (Failed)
	243 - FixTangentVelo (Failed)
	249 - HarmonicNS (Failed)
	264 - HelmholtzFEM (Failed)
	271 - HelmholtzStructure (Failed)
	272 - HelmholtzStructure2 (Failed)
	273 - HelmholtzStructure3 (Failed)
	275 - Hybrid2dMeshPartitionMetis_np8 (Failed)
	276 - Hybrid2dMeshPartitionMetisConnect_np8 (Failed)
	278 - HydrostaticNSVec-ISMIP-HOM-C (Failed)
	286 - InductionHeating3 (Failed)
	339 - MazeMeshPartitionMetisContig_np6 (Failed)
	430 - PlatesEigenComplex (Failed)
	431 - PlatesHarmonic (Failed)
	504 - SD_HarmonicNS (Failed)
	512 - SD_P2ndDerivatives (Failed)
	567 - Shell_with_Solid_Beam_EigenComplex (Failed)
	573 - ShoeboxFsiHarmonicPlate (Failed)
	574 - ShoeboxFsiStatic (Failed)
	576 - ShoeboxFsiStaticShell (Failed)
	611 - StressConstraintModes3 (Failed)
	637 - VectorHelmholtzImpMatrix (Failed)
	638 - VectorHelmholtzWaveguide (Failed)
	639 - VectorHelmholtzWaveguide2 (Failed)
	640 - VectorHelmholtzWaveguide3 (Failed)
	641 - VectorHelmholtzWaveguide4 (Failed)
	643 - VectorHelmholtzWaveguideNodal (Failed)
	644 - VectorHelmholtzWaveguideQuadBlock (Failed)
	645 - VectorHelmholtzWaveguide_TM (Failed)
	655 - WinkelPartitionMetis_np8 (Failed)
	656 - WinkelPartitionMetisConnect_np8 (Failed)
	657 - WinkelPartitionMetisRec_np8 (Failed)
	697 - circuits2D_harmonic_stranded_homogenization (Failed)
	698 - circuits2D_scan_harmonics (Failed)
	706 - circuits_harmonic_foil (Failed)
	707 - circuits_harmonic_foil_anl_rotm (Failed)
	708 - circuits_harmonic_foil_wvector (Failed)
	709 - circuits_harmonic_homogenization_coil_solver (Failed)
	710 - circuits_harmonic_massive (Failed)
	711 - circuits_harmonic_stranded (Failed)
	712 - circuits_harmonic_stranded_homogenization (Failed)
	742 - freesurf_maxd_np4 (Failed)
	745 - freesurf_maxd_local_np4 (Failed)
	770 - linearsolvers_cmplx (Failed)
	772 - mgdyn2D_compute_average_b (Failed)
	773 - mgdyn2D_compute_bodycurrent (Failed)
	774 - mgdyn2D_compute_complex_power (Failed)
	777 - mgdyn2D_em_harmonic (Failed)
	778 - mgdyn2D_harmonic_anisotropic_permeability (Failed)
	780 - mgdyn2D_scan_homogenization_elementary_solutions (Failed)
	782 - mgdyn_3phase (Failed)
	786 - mgdyn_airgap_force_np2 (Failed)
	787 - mgdyn_airgap_harmonic (Failed)
	803 - mgdyn_harmonic (Failed)
	804 - mgdyn_harmonic_loss (Failed)
	805 - mgdyn_harmonic_wire (Failed)
	806 - mgdyn_harmonic_wire_Cgauge (Failed)
	807 - mgdyn_harmonic_wire_Cgauge_automatic (Failed)
	808 - mgdyn_harmonic_wire_impedanceBC (Failed)
	809 - mgdyn_harmonic_wire_impedanceBC2 (Failed)
	811 - mgdyn_lamstack_lowfreq_harmonic (Failed)
	813 - mgdyn_lamstack_widefreq_harmonic (Failed)
	818 - mgdyn_steady_coils (Failed)
	832 - mgdyn_thinsheet_harmonic (Failed)
	836 - mgdyn_torus_harmonic (Failed)
	839 - mgdyn_wave_eigen (Failed)
	840 - mydyn_wave_harmonic (Failed)
	866 - pointload2 (Failed)

I notice that the most recent full-test Action on the primary branch had 5 failing tests, so perhaps a few failing tests aren't a dealbreaker, but 10x-25x that feels suspicious.

Regardless, I got a tutorial to run all the way through, so if you (or any future travelers) need an exemplar, the above can serve.
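
If the exact SDK version differs on another machine, the SDKROOT path can presumably be queried instead of hard-coded:

# ask the toolchain for the active SDK instead of hard-coding MacOSX13.1.sdk
export SDKROOT=$(xcrun --sdk macosx --show-sdk-path)
echo $SDKROOT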

@ghostforest
Author

ghostforest commented Jul 26, 2024

Thanks for that addition. Yes, mine is quite complicated. What does -DWITH_PARAVIEW:BOOL=ON do?
Is your Elmer multicore-capable without setting OpenMP=TRUE?

make -j$(sysctl -n hw.logicalcpu)

This will utilize the full CPU for compiling. Same for ctest:

ctest -j$(sysctl -n hw.logicalcpu)

By the way, your command does not work for me.

Interestingly, enabling ParaView leads to more tests failing later. Same with MPI and OpenMP.

@krivard

krivard commented Jul 26, 2024

ParaView is a postprocessor/visualization engine that lets you display the results of a solve. It's referenced in the GUI tutorials (http://www.nic.funet.fi/pub/sci/physics/elmer/doc/ElmerTutorials.pdf) and has to be installed separately (https://www.paraview.org/download/) in such a way that running paraview from the command line starts the application. Elmer support for it turns out to be primarily a convenience: it adds a button to the Elmer GUI that starts ParaView with your solve's vtu file, instead of you having to open ParaView and load the vtu yourself.

As far as multicore goes, parallelization does work and does help, but I'm not super familiar with the differences between Open MPI, OpenMP, and MPI, and I'm not certain it is, strictly speaking, using OpenMP to do it. When I do Run -> Parallel settings -> Use parallel solver, it claims to be using mpirun, and my solve times do indeed drop. Hopefully that gives you the information you need; if not, I'm happy to send output from diagnostic commands if you like.
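
For what it's worth, the GUI's parallel solver appears to boil down to roughly the following command-line workflow (a sketch; the mesh directory name, case file, and ElmerGrid partitioning flags are assumptions and may vary by version):

# partition the mesh into 4 pieces with Metis (flags vary by ElmerGrid version)
ElmerGrid 2 2 mesh -metiskway 4

# run the distributed solve, one MPI process per partition
mpirun -np 4 ElmerSolver_mpi case.sif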

@ghostforest
Author

AFAIK, MPI is for clustering multiple machines, whereas OpenMP is a parallelization library for a single machine. It is far easier to use than pthreads, since OpenMP's parallelization is generated by the compiler from directives. Pthreads provide more control but require much more code.
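
In Elmer terms the difference looks roughly like this (a sketch; case.sif and the process/thread counts are placeholders, and the OpenMP variant assumes Elmer was built with -DWITH_OpenMP=TRUE):

# MPI: several ElmerSolver processes on a partitioned mesh (also works across machines)
mpirun -np 4 ElmerSolver_mpi case.sif

# OpenMP: a single process using 4 threads in the parallelized loops
OMP_NUM_THREADS=4 ElmerSolver case.sif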

@raback
Contributor

raback commented Aug 2, 2024

MPI is historically the way to go with Elmer (and many other FEM codes). Once the problem is distributed, the parallelization is trivial (meaning all the physics, assembly, etc.). Only the solution of the linear system requires communication, but this is more or less transparent to the user. There is some functionality that has not been implemented with MPI (e.g. viewfactors and general mortar methods). OpenMP is easier, but it has to be applied at every bottleneck. If you consider that there are 500k lines of code, you can understand that it has not been implemented everywhere and lots of code runs on a single thread.

@ghostforest
Author

Yes! Nobody is complaining about the state of Elmer or saying it should provide more parallelization.
The discussion was "how to compile Elmer with parallelization capabilities on macOS", and that, at least for my part, was solved. Elmer compiles with MPI and OpenMP, though more tests fail than without the parallelization compiled in. I guess some are meant to fail, and some are maybe small calculation differences due to different libraries on macOS, since Apple has to do everything differently...

Thank you for the insight; this was very interesting to read.

@mmuetzel
Contributor

mmuetzel commented Sep 6, 2024

Elmer is now being built regularly on macOS (Intel and Apple Silicon) in CI:
https://github.com/ElmerCSC/elmerfem/blob/devel/.github/workflows/build-macos-homebrew.yaml

More tests are passing when using OpenBLAS instead of Apple Accelerate.
Additionally, make sure that you don't exceed the number of physical cores when using MPI (e.g., -DMPI_TEST_MAXPROC=2 for the CI runners).
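
A sketch of how those two suggestions might translate into configure flags (BLA_VENDOR is the generic CMake FindBLAS hint; whether Elmer's build uses it unchanged is an assumption, so check the linked CI workflow for the exact options):

# point CMake at Homebrew OpenBLAS instead of Apple Accelerate,
# and cap MPI tests at the number of physical cores
cmake -DBLA_VENDOR=OpenBLAS \
      -DCMAKE_PREFIX_PATH="$(brew --prefix openblas);$(brew --prefix qt@5)" \
      -DMPI_TEST_MAXPROC=$(sysctl -n hw.physicalcpu) \
      ..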
