Bots wait 10 seconds for every edited page under the current rate limit, so updating multiple pages via a bot can be very time-consuming.
For example:
In the case of the blackjack bot, which updates course pages with new grade distributions, roughly 1000 courses are updated per run, so the 10-second gap stretches the process to nearly 3 hours (1000 × 10 s ≈ 10,000 s).
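For illustration, a rough sketch of what such a bot loop looks like under this limit (the function and variable names here are placeholders, not the actual blackjack code):

```python
# Hypothetical sketch of a bot edit loop under the current 10 s rate limit.
# `edit_page` and `courses` are illustrative placeholders.
import time

RATE_LIMIT_SECONDS = 10  # per-edit gap currently enforced by the wiki

def update_all(courses, edit_page):
    for course in courses:               # ~1000 courses for blackjack
        edit_page(course)                # the edit itself is quick
        time.sleep(RATE_LIMIT_SECONDS)   # but each one costs a full 10 s window

# ~1000 edits * 10 s = ~10,000 s, i.e. close to 3 hours end to end
```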
We can reduce the rate limit to, e.g., 1 request per second, but the point of the rate limit is to prevent a script from accidentally slowing down the wiki or taking it offline for regular users. By comparison, it's okay for bots to take a relatively long time. This problem is particularly severe because I think we don't serve multiple requests concurrently, so fixing that is a good first step.
Working on handling multiple requests concurrently would be good.
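For illustration, here is a minimal sketch of one way to serve requests concurrently, assuming the wiki is a Python WSGI app; the actual stack, entry point, and app name are assumptions, not the real setup:

```python
# Minimal sketch: handle each request in its own thread instead of serially,
# using only the standard library. The `app` below is a stand-in for the wiki.
from socketserver import ThreadingMixIn
from wsgiref.simple_server import WSGIServer, make_server

class ThreadingWSGIServer(ThreadingMixIn, WSGIServer):
    """Serve each request in a separate thread."""
    daemon_threads = True

def app(environ, start_response):
    # Placeholder WSGI app standing in for the wiki.
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"hello\n"]

if __name__ == "__main__":
    # With threading, a slow bot edit no longer blocks other users' page loads.
    with make_server("", 8000, app, server_class=ThreadingWSGIServer) as httpd:
        httpd.serve_forever()
```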
I overlooked the part about slowing down the wiki, but given that the server currently handles only one request at a time, the rate limit should not be reduced much further. When I was running blackjack, it slept for anywhere between 6 and 9 seconds between edits. If the rate limit were reduced by a large factor, the bot would barely sleep between edits, and that idle time is what currently leaves room to serve other users' requests.
So, I too agree that serving requests concurrently should be prioritized.
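For reference, the observed 6-9 second sleeps are consistent with a client that sleeps only the remainder of the 10-second window after each edit finishes; a rough sketch of that behaviour (names are illustrative, not the actual blackjack or wiki client code):

```python
# Sketch of "sleep the remainder of the window" rate limiting.
import time

MIN_GAP = 10  # seconds the wiki requires between edits

def rate_limited_edit(edit_page, page):
    start = time.monotonic()
    edit_page(page)                    # the request itself takes ~1-4 s
    remaining = MIN_GAP - (time.monotonic() - start)
    if remaining > 0:
        time.sleep(remaining)          # hence the observed 6-9 s sleeps
```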