
'Failed to limit memory to' error on Mac M3 Max, while SPAdes runs normally via a virtual Linux machine on the same computer. #1324

Open · 1 task done
mkazanov opened this issue Jul 2, 2024 · 6 comments
Labels: io Input / output (e.g. due to disk storage) errors


mkazanov commented Jul 2, 2024

Description of bug

The same FASTQ files are processed normally in a virtual Linux machine (Ubuntu 22.04.2 ARM64) on a Mac M3 Max with macOS Sonoma, but fail with a 'failed to limit memory' error when run natively on macOS.

spades.log
params.txt

SPAdes version

v4.0.0

Operating System

macOS Sonoma 14.1

Python Version

3.9.6

Method of SPAdes installation

binaries

No errors reported in spades.log

  • Yes
asl (Member) commented Jul 2, 2024

The error in the log is completely unrelated to that warning.

First of all, starting from Monterey it is not possible to really limit memory usage in many circumstances unless one is running as root. So we demoted the error to a warning here:

  0:00:00.000     1M / 16M   WARN    General                 (memory_limit.cpp          :  52)   Failed to limit memory to 250 Gb, setrlimit(2) call failed, errno = 22 (Invalid argument). Watch your memory consumption!
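
For reference, the failing call is setrlimit(2). A minimal C++ sketch of what such a memory cap looks like (the use of RLIMIT_AS below is an illustrative assumption, not necessarily the exact resource memory_limit.cpp uses):

```cpp
#include <sys/resource.h>
#include <cerrno>
#include <cstdio>
#include <cstring>

int main() {
    // Try to cap memory at 250 GB, mirroring the value in the SPAdes log.
    // RLIMIT_AS is an assumption for illustration purposes.
    struct rlimit rl;
    rl.rlim_cur = 250ULL * 1024 * 1024 * 1024;
    rl.rlim_max = rl.rlim_cur;
    if (setrlimit(RLIMIT_AS, &rl) != 0) {
        // On recent macOS this tends to fail with EINVAL (errno 22) for
        // unprivileged processes, hence the demotion to a warning.
        std::fprintf(stderr, "setrlimit failed, errno = %d (%s)\n",
                     errno, std::strerror(errno));
    }
    return 0;
}
```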

Going back to the error: it is an I/O error, meaning SPAdes is unable to open one of its intermediate files. So I'd suggest you check the system log and the free disk space available.
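
If the failure reproduces, the errno attached to the failed open is usually the quickest way to tell such causes apart. A small sketch (not SPAdes code; the file path is hypothetical) of that kind of check:

```cpp
#include <cerrno>
#include <cstdio>
#include <cstring>

int main() {
    // Opening a file the way any tool does; on failure errno distinguishes
    // the usual suspects: ENOSPC (no space left on device) vs.
    // EMFILE (too many open files for this process).
    std::FILE *f = std::fopen("/tmp/intermediate.bin", "rb");
    if (!f) {
        std::fprintf(stderr, "open failed, errno = %d (%s)\n",
                     errno, std::strerror(errno));
        return 1;
    }
    std::fclose(f);
    return 0;
}
```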

asl added the io Input / output (e.g. due to disk storage) errors label on Jul 2, 2024
mkazanov (Author) commented Jul 2, 2024

The system has 4 TB of free space, and the error is reproducible.
There have been no disk errors so far.
What do you recommend?

asl (Member) commented Jul 2, 2024

> The system has 4 TB of free space, and the error is reproducible. What do you recommend?

Check the system limit on the number of open files, e.g. via `ulimit -n`? It is usually very low on macOS (compared to Linux).
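
A programmatic equivalent of `ulimit -n` is getrlimit(2) with RLIMIT_NOFILE, which reports both the soft and the hard limit; a minimal sketch:

```cpp
#include <sys/resource.h>
#include <cstdio>

int main() {
    // RLIMIT_NOFILE is the per-process open-file limit (`ulimit -n`):
    // rlim_cur is the soft limit, rlim_max the hard limit.
    // An "unlimited" value shows up as the huge RLIM_INFINITY constant.
    struct rlimit rl;
    if (getrlimit(RLIMIT_NOFILE, &rl) == 0)
        std::printf("open files: soft = %llu, hard = %llu\n",
                    (unsigned long long)rl.rlim_cur,
                    (unsigned long long)rl.rlim_max);
    return 0;
}
```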

mkazanov (Author) commented Jul 2, 2024

I've increased it to 1,000,000 (`ulimit -S -n 1000000`), but the result is the same.

asl (Member) commented Jul 2, 2024

> I've increased it to 1,000,000 (`ulimit -S -n 1000000`), but the result is the same.

Did it really increase the limit? Have you checked via `ulimit -a`? Note that you cannot raise the soft limit beyond the hard one.
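
The same constraint in code: setrlimit(2) rejects a soft limit above the hard one (EINVAL), so the usual idiom is to clamp the requested value against rlim_max. A sketch:

```cpp
#include <sys/resource.h>
#include <algorithm>
#include <cstdio>

int main() {
    struct rlimit rl;
    if (getrlimit(RLIMIT_NOFILE, &rl) != 0)
        return 1;
    // A soft limit above the hard limit is rejected with EINVAL, so clamp.
    // Note: the macOS setrlimit(2) man page additionally recommends
    // min(OPEN_MAX, rlim_max) for RLIMIT_NOFILE.
    rl.rlim_cur = std::min<rlim_t>(1000000, rl.rlim_max);
    if (setrlimit(RLIMIT_NOFILE, &rl) != 0)
        std::perror("setrlimit");
    else
        std::printf("soft open-file limit now %llu\n",
                    (unsigned long long)rl.rlim_cur);
    return 0;
}
```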

mkazanov (Author) commented Jul 2, 2024

Yes, I've checked via `ulimit -S -n`.
The hard limit (`ulimit -H -n`) is unlimited.
