Query buffering issues #147
Replies: 5 comments 3 replies
-
Thank you very much for your report. I think we can ignore the last error; the real issue is the high memory usage. Please tell me more about the problem. Thank you very much for your help! Kind regards, zepich
-
I'm using the Docker Compose setup with the basic default settings, the same as in the documentation, on a machine with 8 GB of RAM and 8 processing cores. I verified that the container sits at around 200 MB of memory and only scales up its processing when it receives a high request load from the application. I raised the limit in php.ini to 256 MB per request and the problem continued. I changed MariaDB's buffer size to 512 MB and the same problem remained, which made me think it might be a question of query performance in MariaDB. The page shows: "Oops! An Error Occurred. The server returned a '500 Internal Server Error'."
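For reference, the php.ini change described above corresponds to the `memory_limit` directive (the value below is the one from the report; the error log further down shows the limit that was actually in effect was 128 MB, i.e. 134217728 bytes):

```ini
; php.ini — per-request memory limit
memory_limit = 256M
```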
-
Thank you very much for all the details. Can you tell me in which situation it happens? Does it always happen when you try to log in or open mosparo in general? Or only, for example, when you view a large rule or ruleset? Or only when you have many items per page in the list pages? I do not think it is a MySQL database issue. I think we try to load too much data from the database, like a large ruleset or a large submission, and do not have enough memory for it. Can you tell me more about when it happens? Maybe with the URL path in your browser (without your domain, for data privacy). Thank you very much for your help! Kind regards, zepich
-
I've created 100k submit tokens and submissions and can reproduce the issue. The issue happens in the cleanup helper, which tries to load all deletable submissions (older than 14 days, or older than 24 hours if not submitted or spam), which results in a large amount of data with 180k submit tokens and submissions. In this cleanup helper, we load the complete submission, which ends in an error because there is simply not enough memory for so many submissions. We need some kind of loop to remove the submissions one by one. I will create an issue (#148) for this problem and work on a solution. Are these 500k submit tokens and submissions from a test environment or a production setup? Kind regards, zepich
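The loop described above could be sketched roughly like this in plain PDO. This is a minimal illustration only, not mosparo's actual implementation: the `submission` table here is a stand-in, SQLite is used so the example is self-contained, and mosparo itself goes through Doctrine.

```php
<?php
// Sketch: delete rows in small batches instead of loading every
// deletable submission into memory at once. Table name is hypothetical.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->exec('CREATE TABLE submission (id INTEGER PRIMARY KEY)');
$insert = $pdo->prepare('INSERT INTO submission (id) VALUES (?)');
for ($i = 1; $i <= 1000; $i++) {
    $insert->execute([$i]);
}

// Each iteration removes at most $batchSize rows, so memory use
// stays flat no matter how many rows are deletable in total.
$batchSize = 100;
do {
    $deleted = $pdo->exec(
        "DELETE FROM submission WHERE id IN " .
        "(SELECT id FROM submission LIMIT $batchSize)"
    );
} while ($deleted > 0);

$count = (int) $pdo->query('SELECT COUNT(*) FROM submission')->fetchColumn();
echo "remaining: $count\n"; // prints "remaining: 0"
```

In a real cleanup the inner `SELECT` would also carry the age/spam conditions, and the loop could additionally stop after a time budget is exceeded.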
-
We've just released v1.0.4, which contains a new cleanup logic. This new cleanup logic works differently than before and is optimized for huge amounts of submissions and submit tokens. This logic deletes max. 25'000 submissions at once and will stop after 1.5 seconds. If one of these limits is reached, the cleanup process will be executed again after 10 minutes. Additionally, the cleanup process will be executed every 6 hours (before only every 24 hours). Also, the whole cleanup logic is optimized to use as little memory as possible. I've worked and tested with only 64 MB of memory (I still recommend using at least 128 MB). To clean up the database faster, I recommend using the CLI command with a cronjob. Do you use the command already? The command didn't work before with this huge amount of data, but now it should work perfectly. I'm sorry for the long wait. It was a challenging issue. First, I had to find a solution to create a huge amount of submissions and submit tokens, and after that, the perfect solution for the cleanup logic. Thank you very much for your report and for helping to make mosparo better! I'm looking forward to hearing from you. Please let me know if you need any help. Kind regards, zepich |
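For running the cleanup command from a cronjob as suggested above, a crontab entry could look like the following sketch. The command name `mosparo:cleanup-database` and the installation path `/mosparo` are assumptions here; run `php bin/console list` in your installation to confirm the actual command name and adjust the path and schedule to your setup.

```
# Hypothetical crontab entry: run the mosparo cleanup command every hour.
# Command name and path are assumptions — verify them for your setup.
0 * * * * php /mosparo/bin/console mosparo:cleanup-database >/dev/null 2>&1
```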
-
I found an error when using mosparo on a large scale and verified that it is a memory problem. I found a workaround that is not really suitable (raising the PHP memory limit):
https://stackoverflow.com/questions/561066/fatal-error-allowed-memory-size-of-134217728-bytes-exhausted
However, I verified that a different MySQL driver library (mysqlnd instead of libmysqlclient) can also be used to avoid this problem:
https://serverfault.com/questions/400434/php-mysqlnd-or-php-mysql
The error log was:

```
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 16384 bytes) in /mosparo/vendor/doctrine/dbal/src/Driver/PDO/Statement.php on line 121
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 65536 bytes) in /mosparo/vendor/composer/ClassLoader.php on line 571
Fatal error: Uncaught PDOException: SQLSTATE[HY000]: General error: 2014 Cannot execute queries while other unbuffered queries are active. Consider using PDOStatement::fetchAll(). Alternatively, if your code is only ever going to run against mysql, you may enable query buffering by setting the PDO::MYSQL_ATTR_USE_BUFFERED_QUERY attribute. in /mosparo/vendor/symfony/http-foundation/Session/Storage/Handler/PdoSessionHandler.php:353
Stack trace:
#0 /mosparo/vendor/symfony/http-foundation/Session/Storage/Handler/PdoSessionHandler.php(353): PDOStatement->execute()
#1 /mosparo/vendor/symfony/http-foundation/Session/Storage/Handler/AbstractSessionHandler.php(120): Symfony\Component\HttpFoundation\Session\Storage\Handler\PdoSessionHandler->doWrite('078f5ab84ad06b3...', '_sf2_attributes...')
#2 /mosparo/src/Session/DatabaseSessionHandler.php(64): Symfony\Component\HttpFoundation\Session\Storage\Handler\AbstractSessionHandler->write('078f5ab84ad06b3...', '_sf2_attributes...')
#3 /mosparo/vendor/symfony/http-foundation/Session/Storage/Proxy/SessionHandlerProxy.php(73): Mosparo\Session\DatabaseSessionHandler->write('078f5ab84ad06b3...', '_sf2_attributes...')
#4 [internal function]: Symfony\Component\HttpFoundation\Session\Storage\Proxy\SessionHandlerProxy->write('078f5ab84ad06b3...', '_sf2_attributes...')
#5 {main}
thrown in /mosparo/vendor/symfony/http-foundation/Session/Storage/Handler/PdoSessionHandler.php on line 353
```
How can we resolve this? I believe that when you query the database, the application is loading all the rows just to get to a certain point, which is not necessary. This drives memory usage to its maximum, slowing down and crashing the application and generating this error.
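The last error in the log hints at one way to avoid the "unbuffered queries" conflict: drain or close a statement's result set before issuing the next query on the same connection. A minimal sketch with plain PDO follows (SQLite is used here only so the example is self-contained; SQLite does not raise this particular error, and with MySQL you could alternatively enable `PDO::MYSQL_ATTR_USE_BUFFERED_QUERY`, as the error message itself suggests):

```php
<?php
// Sketch: finish one statement's results before running another query
// on the same connection, as the PDO error message recommends.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->exec('CREATE TABLE item (id INTEGER PRIMARY KEY, name TEXT)');
$pdo->exec("INSERT INTO item (name) VALUES ('a'), ('b'), ('c')");

$stmt = $pdo->query('SELECT id, name FROM item');

// Option 1: fetch everything at once so no results remain pending.
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

// Option 2 (alternative): $stmt->closeCursor(); releases the result
// set without fetching the remaining rows.

// Now it is safe to run the next query on the same connection.
$count = (int) $pdo->query('SELECT COUNT(*) FROM item')->fetchColumn();
echo count($rows) . " rows, count=$count\n"; // prints "3 rows, count=3"
```

Note that `fetchAll()` trades the buffering error for memory usage, so for very large result sets it is better to avoid loading the full set in the first place, as the batched cleanup discussed earlier in this thread does.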