
CICD tests get stuck very frequently #3591

Closed
germa89 opened this issue Nov 28, 2024 · 5 comments

germa89 commented Nov 28, 2024

Since #3268, the CICD pipeline takes forever, exhausting the configured timeout. For instance:

https://github.com/ansys/pymapdl/actions/runs/12047613830/job/33590706548?pr=3583

I'm running tests locally in a container that should replicate a similar configuration. I'm going to use this issue to report any thoughts or findings.

At first I thought the gRPC channel was freezing, which is why I deactivated `_subscribe_to_channel` and the threads that check the stdout. But neither change fixed the issue.
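One way to confirm whether a call is actually hanging (rather than just slow) is to run it in a worker thread and give up after a deadline. This is a hypothetical diagnostic sketch, not PyMAPDL code; the function names are illustrative:

```python
# Hypothetical diagnostic sketch: run a potentially blocking call in a worker
# thread so the caller can bail out after a deadline instead of hanging the
# whole test session. Not PyMAPDL's implementation.
from concurrent.futures import ThreadPoolExecutor, TimeoutError as FutTimeout
import time

def run_with_deadline(fn, *args, deadline=5.0):
    """Return fn(*args), or raise TimeoutError after `deadline` seconds."""
    with ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(fn, *args)
        return future.result(timeout=deadline)

def fast_command():
    return "ok"

def stuck_command():
    time.sleep(0.5)  # stands in for an APDL command that never returns

print(run_with_deadline(fast_command))            # prints "ok"
try:
    run_with_deadline(stuck_command, deadline=0.1)
except FutTimeout:
    print("command timed out")                    # prints "command timed out"
```

If the wrapped call times out consistently on the same command, the hang is deterministic rather than a flaky channel.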

germa89 commented Nov 28, 2024

I have seen twice how MAPDL gets stuck on this line:

*LSBAC,XVIBGC,fz,uz1

which is run in the test:

tests/test_krylov.py::test_krylov_with_pressure_load[L-1] <- ../../../Users/german.ayuso/pymapdl/tests/test_krylov.py 

germa89 commented Nov 28, 2024

When the simulation is stuck, PyMAPDL is still "in control": if I press Ctrl+C, I get this output in the logger:

DEBUG - GRPC_127.0.0.1:50053 -  mapdl_core - run - Running (verbose: False, mute=True): '*LSBAC,XVIBGC,fz,uz1'
INFO - pymapdl_global -  errors - handler - KeyboardInterrupt received.  Waiting until MAPDL execution finishes

Since Ctrl+C does not kill the channel or the APDL command (it just waits for it to finish), I would say the problem lies in the gRPC channel, or in MAPDL not properly handling the output file, or something similar.
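The logged behavior ("KeyboardInterrupt received. Waiting until MAPDL execution finishes") suggests a handler that defers the interrupt until the in-flight call returns. A minimal sketch of that pattern, with illustrative names (this is an assumption about the mechanism, not PyMAPDL's actual handler):

```python
# Hypothetical sketch of a deferred-interrupt handler: Ctrl+C during a remote
# call is only recorded, and the KeyboardInterrupt is re-raised after the call
# completes. Names are illustrative, not PyMAPDL's actual implementation.
import signal

def protect_from_interrupt(fn, *args):
    """Run fn(*args); if SIGINT arrives meanwhile, re-raise it afterwards."""
    interrupted = []

    def handler(signum, frame):
        print("KeyboardInterrupt received. Waiting until execution finishes")
        interrupted.append(True)  # remember, but do not abort the call

    previous = signal.signal(signal.SIGINT, handler)
    try:
        result = fn(*args)
    finally:
        signal.signal(signal.SIGINT, previous)  # always restore the handler
    if interrupted:
        raise KeyboardInterrupt
    return result
```

The consequence is exactly what's observed: if the remote command never returns, Ctrl+C never takes effect, and the process appears alive but unresponsive.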

germa89 commented Nov 28, 2024

Testing with the following script:

export ON_LOCAL=False
export ON_CI=True
export PYMAPDL_START_INSTANCE=False
# export GRPC_VERBOSITY="debug"
# export GRPC_TRACE=api,call,connectivity_state,http_keepalive,http,handshaker
export ANSYSLMD_LICENSE_FILE=REDACTED
export PYMAPDL_LOG_APDL=true
export PYMAPDL_PORT=50053

rm ~/pymapdl/*.log

rm ~/jobs/*
ansysgrpc -port 50053 -dir ~/jobs > ~/pymapdl/apdl.log &

xvfb-run pytest --random-order-seed=567582

Using the MAPDL 24.1 container image.

germa89 changed the title from "CICD takes too long" to "CICD tests get stuck very frequently" Dec 3, 2024
germa89 pinned this issue Dec 3, 2024
germa89 self-assigned this Dec 3, 2024

germa89 commented Dec 10, 2024

It seems to be a parsing issue; for example, MAPDL gets stuck at the end of output like this:

  Process memory allocated for solver              =     0.403 MB
  Process memory required for in-core solution     =     0.385 MB
  Process memory required for out-of-core solution =     0.385 MB

  Total memory allocated for solver                =     1.427 MB
  Total memory required for in-core solution       =     1.362 MB
  Total memory required for out-of-core solution   =     1.362 MB

 *** NOTE ***                            CP =      63.572   TIME= 13:26:02
 The Distributed Sparse Matrix Solver used by the Block Lanczos          
 eigensolver is currently running in the in-core memory mode.  This      
 memory mode uses the most amount of memory in order to avoid using the  
 hard drive as much as possible, which most often results in the         
 fastest solution time.  This mode is recommended if enough physical     
 memory is present to accommodate all of the solver data.                

  Process memory required for in-core LANCZOS Workspace             =     872.601562 KB
  Process memory required for out-of-core LANCZOS Workspace         =     503.601562 KB
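A parser that collects response lines until it sees an end-of-command terminator would hang exactly this way if the solver output stops mid-message. The following is a hypothetical sketch of that failure mode (the sentinel string and function are made up for illustration, not PyMAPDL's actual parser):

```python
# Hypothetical sketch: a reader that waits for a terminator line blocks
# forever when the output stream ends without one. The sentinel is invented
# for illustration; it is not MAPDL's real end-of-response marker.
import queue

SENTINEL = "***** END OF COMMAND *****"  # hypothetical terminator

def read_response(lines, timeout=1.0):
    """Collect lines until the sentinel; raise if it never arrives."""
    stream = queue.Queue()
    for line in lines:
        stream.put(line)  # simulate lines arriving on the channel
    collected = []
    while True:
        try:
            line = stream.get(timeout=timeout)
        except queue.Empty:
            # Without this timeout, a real blocking read would hang forever.
            raise TimeoutError(
                f"stream ended after {len(collected)} lines, no terminator"
            )
        if line.strip() == SENTINEL:
            return collected
        collected.append(line)

# Output that stops mid-message, like the LANCZOS workspace lines above:
truncated = [
    " Process memory required for in-core LANCZOS Workspace     = 872.6 KB",
    " Process memory required for out-of-core LANCZOS Workspace = 503.6 KB",
]
try:
    read_response(truncated, timeout=0.1)
except TimeoutError as exc:
    print("hang detected:", exc)
```

If this is the failure mode, the fix would be on the server side (always emit the terminator) or on the client side (bound the wait), which matches the "parsing issue" hypothesis.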

germa89 commented Dec 13, 2024

Closed in #3608
