
can we avoid large memory allocations and swapping? #40

Open
tdhock opened this issue Aug 27, 2020 · 5 comments
Comments


tdhock commented Aug 27, 2020

Hi @akhikolla, I noticed when testing binsegRcpp that if you pass a large integer as the max_segments argument, R may try to do a big memory allocation and freeze the computer (thrashing).
It would be nice to avoid that automatically (kill the process instead of thrashing), but I'm not sure how.
@agroce, did you ever encounter something similar? Any ideas for a solution?
(Of course the user could rewrite the code to exit early before a big memory allocation, but it would be better if we did not require the user to rewrite the code.)


agroce commented Aug 27, 2020

Maybe you can just use whatever OS facility sets the process memory limit? Something, somewhere, will have to impose a limit. Usually I just restrict integer sizes that flow into an allocator so they can't get too ridiculously big.


tdhock commented Aug 27, 2020

According to https://alex.dzyoba.com/blog/restrict-memory/, ulimit does not work, but qemu and lxc-execute do. Do you have any experience with those tools?


agroce commented Aug 27, 2020

Not really; I have used QEMU indirectly, in that some fuzzers use it to carry out their instrumentation. I think it has pretty serious overhead?


tdhock commented Aug 27, 2020

OK, so if this turns out to be a big issue, I guess we should try lxc-execute first.


tdhock commented Oct 23, 2020

Right now this is hard-coded; we could allow the user to specify the max vector size as an argument to the R function that does the random input generation, e.g.,

RcppDeepState::deepstate_compile_run(max.vector.length=1000)
