
Extremely low TPS #15

Open
LoneDev6 opened this issue Jul 28, 2019 · 7 comments

Comments

@LoneDev6

LoneDev6 commented Jul 28, 2019

Hello, it seems that this plugin creates a lot of lag. I tried to tweak everything in the config but nothing helped.
See these timings: https://timings.aikar.co/?id=dc0ddea53d0c4348972f1bd78a698ba9

This is my config (ignore the empty defaults; I set them to empty to see if it would benefit TPS, with no success): Link

@totemo
Member

totemo commented Jul 28, 2019

I'll look into it but I suspect what you are actually seeing is the latency of chunk loading in 1.14.

The time for Minecraft::world - chunkAwait is listed as a child of the MobLimiter event handler there and chunkAwait is taking forever.

You're running 1.14, right? The results for 1.13 looked quite different. Ensure you're on the latest 1.14.4 Paper build and double your RAM allocation, whatever it currently is (but please do give us the details).

@totemo
Member

totemo commented Jul 28, 2019

Okay, at a quick first glance, ML needs to load a few chunks in the configured radius:
https://github.com/NerdNu/MobLimiter/blob/master/src/nu/nerd/moblimiter/limiters/SpawnLimiter.java#L79
in order to count the entities. 1.14's chunk loading is pretty shaky. Definitely get on 1.14.4 if you aren't already and keep an eye on how much RAM is being used. This code is fairly innocuous, really, and would not cause a problem if the underlying chunk loading were performant. I have a sneaking suspicion there's a memory leak that is causing GC thrash, but I need to get some better diagnostics to confirm that.
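
Roughly speaking, the counting boils down to a nested loop over the chunks within the configured radius of the spawn chunk. This is just an illustrative sketch against the Bukkit API, not the plugin's literal code (see the linked SpawnLimiter.java for that); the method name is made up:

```java
import org.bukkit.Chunk;
import org.bukkit.Location;
import org.bukkit.World;

// Illustrative only: count entities in all chunks within `radius` chunks of the
// spawn location. World#getChunkAt(x, z) loads (and generates, if necessary)
// any chunk that isn't already loaded, which is where 1.14's chunk-loading
// latency shows up under the MobLimiter event handler in the timings.
static int countEntitiesAround(Location spawnLocation, int radius) {
    World world = spawnLocation.getWorld();
    Chunk centre = spawnLocation.getChunk();
    int count = 0;
    for (int dx = -radius; dx <= radius; dx++) {
        for (int dz = -radius; dz <= radius; dz++) {
            Chunk chunk = world.getChunkAt(centre.getX() + dx, centre.getZ() + dz);
            count += chunk.getEntities().length;
        }
    }
    return count;
}
```

With a radius of 2, for example, that is a 5x5 block of chunks, i.e. up to 24 potential chunk loads per spawn attempt on top of the chunk the mob is spawning into.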

@totemo
Member

totemo commented Jul 28, 2019

You could also reduce radius right down to 0 so that no extra chunks are loaded when counting.
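
With radius 0 the count degenerates to just the chunk the mob is spawning into, which is necessarily loaded already, so no extra chunk loads are triggered. Sketching the idea (a CreatureSpawnEvent handler is assumed; the method name is illustrative):

```java
import org.bukkit.Chunk;
import org.bukkit.event.entity.CreatureSpawnEvent;

// Illustrative radius-0 case: only the spawn chunk itself is inspected.
static int countEntitiesAtSpawnChunk(CreatureSpawnEvent event) {
    Chunk spawnChunk = event.getLocation().getChunk();
    return spawnChunk.getEntities().length;
}
```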

@LoneDev6
Author

Yes, 1.14.4.
Interesting, I'll now try setting radius to 0 and see if TPS stays at 19-20.

@LoneDev6
Author

LoneDev6 commented Aug 2, 2019

TPS is really good; no more TPS drops after I set this config:
https://pastebin.com/AxY4dNWi

@LoneDev6
Author

LoneDev6 commented Aug 5, 2019

May I remove the age attribute? I sometimes see in timings reports that it seems to use a lot of resources even when I set it to -1.
[screenshot of the timings report]

It would be cool to be able to disable AgeLimiter listener registration.
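
Something along these lines is what I mean, just a hypothetical sketch of a config flag guarding the registration, not an existing MobLimiter option (the config key and class names are made up):

```java
import org.bukkit.event.Listener;
import org.bukkit.plugin.java.JavaPlugin;

public class ExamplePlugin extends JavaPlugin {

    // Stand-in for the real AgeLimiter event handler.
    static class AgeLimiterListener implements Listener {
    }

    @Override
    public void onEnable() {
        saveDefaultConfig();
        // Hypothetical flag: skip registering the listener entirely when disabled,
        // so the handler never shows up in timings at all.
        if (getConfig().getBoolean("age-limiter.enabled", true)) {
            getServer().getPluginManager().registerEvents(new AgeLimiterListener(), this);
        }
    }
}
```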

@totemo
Member

totemo commented Mar 14, 2020

By the looks of that, it's taking a large chunk of one tick (28.73 ms is over half the 50 ms tick duration) but only infrequently. The 0.01 per tick means that it is running once every 100 ticks (rounded up, probably). The average time is 0.37 ms per server tick, which is (0.37 ms / 50 ms) * 100% = 0.74% of your server's available tick time. Less than 1% is what I would consider negligible load.

I'll maybe add an option for that stuff next time I pop the hood on this code.
