Challenges & Solutions
While working on the project, the following challenges and their corresponding solutions were identified:
- Challenge: An added sub-module does not appear in the same section as the event log.
  Solution: Add the package name and dependencies to the sub-module's .info file so that it appears under the right section ("Statistics" and "Visualizations" for the event_log and log_data_visualization modules, respectively), e.g.:

      package = Statistics
      dependencies[] = event_log
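Put together, a complete .info file for such a sub-module might look like the following minimal sketch; the name, description, and core version are hypothetical, while the package and dependency lines are the ones from the fix above:

```ini
; Hypothetical .info file for an event_log statistics sub-module.
name = Event Log Statistics
description = Hypothetical sub-module illustrating the package/dependencies fix.
core = 7.x
package = Statistics
dependencies[] = event_log
```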
- Challenge: No configuration option is available when a new sub-module is added.
  Solution: Add configure = admin/config/event_log_file to the corresponding .info file of the module.
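In context, the configure entry sits alongside the other .info keys; a minimal sketch, in which the name, description, and core version are hypothetical and only the configure path is taken from the fix above:

```ini
; Hypothetical .info file showing where the configure key goes.
name = Event Log File
description = Hypothetical sub-module illustrating the configuration link fix.
core = 7.x
configure = admin/config/event_log_file
```

With this key present, the module's row on the admin modules page gains a link to its configuration form.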
- Challenge: The path to backup_#.txt in the event_log_backup module does not work.
  Solution: Change the path to public://backup_#.txt instead of specifying the complete path. The file is then created automatically under sites/default/files/ and no permission issues arise.
- Challenge: The feature that logs data to a file instead of the database conflicts with the "Notify illegal access to authorized user" feature, which depends on events being logged to the database: it looks up previous failed log entries. Reading those entries from a file would be prohibitively slow, because files are accessed sequentially, leading to significant overhead.
  Solution: Disable the notification feature whenever the admin enables the log-to-file feature. This is done by checking the value of the event_log_file_enable checkbox in the event_log_file module.
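In the module itself this check would be written in Drupal's PHP, but the guard logic can be sketched language-agnostically; everything here except the event_log_file_enable setting name is a hypothetical illustration:

```python
def should_notify_illegal_access(settings: dict) -> bool:
    """Hypothetical sketch of the guard condition.

    The illegal-access notification is skipped whenever log-to-file
    is enabled, because its previous failed entries would then have
    to be scanned sequentially from a file.
    """
    # event_log_file_enable mirrors the checkbox value from the text;
    # a missing key is treated as "log to file disabled".
    return not settings.get("event_log_file_enable", False)
```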
- Challenge: When the "Ability to log to file instead of database" sub-module is enabled, the log file may grow so large that it becomes difficult to manage. There should also be some centralised storage for the files.
  Solution: The problem could be addressed with database partitioning, which separates a very large database into smaller, faster, more easily managed parts; this kind of partitioning is known as sharding. MongoDB provides auto-sharding functionality, which MySQL does not. However, the following reasons make MongoDB unsuitable as a replacement for MySQL in event_log:
- In Drupal, certain tables (around 74) are hard-wired into core and stored in memcache, so they cannot be completely transferred to MongoDB; some of those tables are also required to support the "Collaborating communities" project group.
- The performance benefits MongoDB provides are outweighed by the complexity of trying to extend Drupal features backed by MongoDB. Moreover, most Drupal modules use hard-coded MySQL queries, which are incompatible with MongoDB.
- A document database like MongoDB is much better suited to serving a large volume of reads very quickly and scales to multiple servers very easily. For a large website that serves an enormous amount of content to be read (and not updated) by users, a solution like MongoDB can be advantageous. But with a lot of interactive content that is edited and updated, i.e. many writes to the database, MongoDB may offer no improvement and may actually cause duplication problems if not properly managed. Event logging consists mainly of frequent database writes, so MongoDB is unlikely to be useful for it.
- The MongoDB module clearly states what it can store in MongoDB: Cache, Field storage, Session, Watchdog, Lock, Block and Queue. The event log table required here is not included for conversion in the MongoDB module.
The solution to this challenge is under development and has been implemented on an experimental basis: new log entries are written to a file on the cloud by a Python script, and as soon as the file reaches the storage threshold, a new file is created and further log entries are written to it.
Sub-modules
- Ability to log to a file instead of a database
- Backup event_log database content
- Clear log messages button
- Notify Illegal access to authorized user
- Log Data to Cloud (Experimental)