issue with MPI #3298
Replies: 5 comments 6 replies
-
Did you remember to start the venv?
On 23 Dec 2023, at 05:14, Qile Yan wrote:
Hi,
I followed the instructions on parallelism here: https://www.firedrakeproject.org/parallelism.html and installed Firedrake with
python3 firedrake-install --mpiexec=mpiexec --mpicc=mpicc --mpicxx=mpicxx --mpif90=mpif90
But when I try to run a script with mpiexec, it says: No module named 'firedrake'. On the other hand, there is no problem with mpirun. Do you know the reason for that? Thank you very much.
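A quick way to see what the two launchers are actually doing is to run a small diagnostic under each and compare the output (a minimal sketch; the file name check_env.py is just illustrative):

```python
# check_env.py -- minimal sketch: run under both launchers, e.g.
#   mpiexec -n 2 python3 check_env.py
#   mpirun -n 2 python3 check_env.py
# and compare which interpreter each rank runs and whether firedrake imports.
import sys
from mpi4py import MPI

comm = MPI.COMM_WORLD
# If this is not the venv's python3, the launcher is not using the activated venv.
print(f"rank {comm.rank}/{comm.size}: python = {sys.executable}")

try:
    import firedrake  # noqa: F401
    print(f"rank {comm.rank}: firedrake imports fine")
except ImportError as exc:
    print(f"rank {comm.rank}: firedrake import failed: {exc}")
```

If the two launchers report different interpreters, they are most likely picking up different MPI installations or environments.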
-
Then I think your Firedrake install can’t have been successful. Does it work if you install without those flags?
On 23 Dec 2023, at 17:03, Qile Yan wrote:
Yes, I started the venv.
-
I think that unless you know a lot about your system, it is best not to use the flags.
On 23 Dec 2023, at 17:31, Qile Yan wrote:
Without those flags, both mpiexec and mpirun work. With flags, only mpirun works.
-
It’s a reasonable hypothesis that you ran out of memory. In that case it is a problem with neither Firedrake nor MPI, but simply that you need a bigger machine.
You could test the hypothesis by monitoring the memory usage of the job.
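One rough way to do that from inside the script, if watching top across 30 ranks is awkward, is to log the resident set size at a few points (a sketch only; log_memory is a made-up helper, not part of Firedrake, and the ru_maxrss units assume Linux):

```python
# Sketch of a per-rank memory logger. Call log_memory("after refine"),
# log_memory("after solve"), etc. to see where the memory use jumps.
import resource
from mpi4py import MPI

def log_memory(label):
    comm = MPI.COMM_WORLD
    # ru_maxrss is reported in kilobytes on Linux; convert to gigabytes.
    rss_gb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024**2
    total = comm.allreduce(rss_gb, op=MPI.SUM)
    if comm.rank == 0:
        print(f"{label}: ~{total:.1f} GB total across {comm.size} ranks")
```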
On 24 Dec 2023, at 02:06, Qile Yan wrote:
Thanks for the suggestion.
I am trying to solve a 3D problem. When I installed Firedrake without the flags and used mpiexec -n 30 python3 xxx.py, it works well on a coarse mesh, but on a refined mesh with about 10^8 degrees of freedom it fails with: terminated on signal 9 (killed). I think the reason is that it ran out of memory.
Is such an issue related to MPI itself or to Firedrake? Sorry, I am just starting to use these tools. Thank you again for your help.
-
@dham I am using the algebraic multigrid preconditioner. Attached is a minimal example. You may need to update with --netgen first. It fails on my machine when the number of degrees of freedom is around 10^8 to 10^9. It solves -div(grad u) = f with u = 0 on the boundary. To test the code, just rename the file to .py.
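The attached file is not reproduced in this thread; a sketch of the kind of setup described, using a built-in UnitCubeMesh instead of the netgen mesh and illustrative hypre/BoomerAMG options rather than the poster's exact settings, might look like:

```python
# Sketch of the problem described above: -div(grad u) = f with u = 0 on the
# boundary, solved with CG and algebraic multigrid. Mesh size and solver
# options are illustrative, not the poster's exact settings.
from firedrake import (UnitCubeMesh, FunctionSpace, TrialFunction, TestFunction,
                       Function, DirichletBC, Constant, dx, inner, grad, solve)

N = 64  # increase to push the problem towards 1e8 degrees of freedom
mesh = UnitCubeMesh(N, N, N)
V = FunctionSpace(mesh, "CG", 1)

u = TrialFunction(V)
v = TestFunction(V)
f = Constant(1.0)

a = inner(grad(u), grad(v)) * dx
L = f * v * dx
bc = DirichletBC(V, 0.0, "on_boundary")  # u = 0 on the whole boundary

uh = Function(V)
solve(a == L, uh, bcs=bc,
      solver_parameters={"ksp_type": "cg",
                         "pc_type": "hypre",
                         "pc_hypre_type": "boomeramg"})
```

Run in parallel with, e.g., mpiexec -n 30 python3 poisson_amg.py (the file name is arbitrary).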