Contact simulations on a cluster #29688
-
Hello, LU is good for medium-sized problems when the Jacobian is exact, which can be challenging for contact. You'll want to avoid running very many linear iterations, as happened here.
One way would be to set a loose linear solve tolerance; LU tends to solve very tightly, so that should not be an issue. The preferred way would be to set a maximum number of linear iterations so the solve fails faster.
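For reference, a minimal sketch of where these two knobs live in a MOOSE `[Executioner]` block (the executioner type and the values shown are only illustrative assumptions, not recommendations for this particular problem):

```
[Executioner]
  type = Transient
  solve_type = NEWTON

  # Option 1: loosen the linear solve tolerance
  l_tol = 1e-2

  # Option 2 (preferred): cap linear iterations so a struggling solve fails fast
  l_max_its = 30

  # Nonlinear settings left at typical illustrative values
  nl_max_its = 15
  nl_rel_tol = 1e-8
[]
```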
-
l_tol is already set to 1e-3, but I guess I could try l_tol = 1e-2 or even 1e-1. Also, the later nonlinear iterations are shorter, about 2-5 linear iterations per nonlinear, so I'll try capping the linear iterations as well. I worry that the time required to compute the LU factorization is the real cause. Are there other preconditioners that could be used with contact, or is it only LU? Thanks
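For context, the contact examples typically request the LU direct solve through PETSc options along these lines (a sketch; `superlu_dist` is just one common parallel factorization package, and whether an iterative preconditioner converges for a frictional contact problem is a separate question):

```
[Executioner]
  type = Transient
  solve_type = NEWTON
  # Direct LU factorization through PETSc; this is the part one would swap out
  # to experiment with a different preconditioner.
  petsc_options_iname = '-pc_type -pc_factor_mat_solver_package'
  petsc_options_value = 'lu superlu_dist'
[]
```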
-
Tried it with the suggested settings. Is there anything else to try to speed this up?
-
Dear All,
I am working on contact simulations with 100k-500k elements. Ideally, I would run them in 3D, but I could start with 2D. Friction needs to be included, and since multiple simulations are needed, I plan to run them on our computing cluster. I compiled the dependencies using the provided scripts rather than Conda.
I am testing the 3D Berkovich example in the Contact module on a single node with 8 processes and 16 threads (128 CPUs), but it is very slow. The run is still ongoing, and the first nonlinear iteration took about 50 minutes.
I suspect that the LU preconditioner is the cause. Could that be the case? The other examples and the documentation show only the LU preconditioner. Are there alternative preconditioners or settings I can try to speed this up?
Best regards
Andris Freimanis
edit: this is my MOOSE output (still running):