Just to update this — I split my list into 20k chunks, and it's taking at least a day to import each chunk. It's often timing out, so I've had to extend the timeout to 20 hours to get the chunks to process successfully.
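For context, the splitting itself is the quick part. Here's a minimal sketch of the approach (this assumes the source file is one large JSON array; the file name and chunk size are illustrative):

```python
import json

CHUNK_SIZE = 20_000  # the 20k chunk size mentioned above

# Load the full export; "visitors.json" is an illustrative name,
# and this assumes the file is a single JSON array of records.
with open("visitors.json") as f:
    records = json.load(f)

# Write each slice of 20k records out as its own chunk file.
for i in range(0, len(records), CHUNK_SIZE):
    chunk = records[i : i + CHUNK_SIZE]
    with open(f"visitors_chunk_{i // CHUNK_SIZE:03d}.json", "w") as out:
        json.dump(chunk, out)
```

The slow part isn't the splitting; it's getting each chunk imported.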
If anyone has any suggestions on how to go about this better/quicker, it would be super appreciated.
I have a Digital Ocean droplet with 32GB of RAM and 4 vCPUs. The CPU usage never really gets past 6% in total.
I'm running php-fpm with pool settings that are more than enough to handle the traffic on the server.
I'm using a Digital Ocean hosted database, which I've also upgraded to 16GB RAM / 6 vCPUs.
None of these actions seems to be making any noticeable improvement to the import speed.
I need to get this data into the system (and ultimately more after that), but doing it over days and weeks is not sustainable.
Description
I have a JSON file of 300,000+ records, each containing:

- First name (plaintext)
- Last Name (plaintext)
- Address (plaintext)
- Visitor ID (num)
- Region (Category via Title string)
- Preference ID (num)
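For reference, a single record looks roughly like this (field names and values are illustrative, not the actual export):

```json
{
  "firstName": "Jane",
  "lastName": "Doe",
  "address": "123 Example Street",
  "visitorId": 48213,
  "region": "North West",
  "preferenceId": 7
}
```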
Things I have done:

Set up the import to an `Entry`, with `Add Entry` and `Update Entry`, using the `Visitor ID` as the unique identifier, as there are a few thousand records in the system already.

`htop` shows the CPU (4 cores) is barely being used (cores bouncing between 1-10% each) and memory is just under half used.
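For what it's worth, the add-or-update behaviour I'm describing is essentially an upsert keyed on the Visitor ID. Here's a minimal sketch of that pattern on its own (this is not how Feed Me works internally; it's plain SQL against SQLite purely to illustrate, and the table and column names are made up):

```python
import json
import sqlite3

conn = sqlite3.connect("visitors.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS visitors (
           visitor_id    INTEGER PRIMARY KEY,  -- the unique identifier
           first_name    TEXT,
           last_name     TEXT,
           address       TEXT,
           region        TEXT,
           preference_id INTEGER
       )"""
)

with open("visitors_chunk_000.json") as f:  # illustrative chunk file
    chunk = json.load(f)

# Upsert: insert new rows, update rows whose visitor_id already exists.
# executemany runs the whole chunk inside a single transaction, which
# keeps the per-row overhead low.
conn.executemany(
    """INSERT INTO visitors
           (visitor_id, first_name, last_name, address, region, preference_id)
       VALUES (:visitorId, :firstName, :lastName, :address, :region, :preferenceId)
       ON CONFLICT(visitor_id) DO UPDATE SET
           first_name    = excluded.first_name,
           last_name     = excluded.last_name,
           address       = excluded.address,
           region        = excluded.region,
           preference_id = excluded.preference_id""",
    chunk,
)
conn.commit()
```

The gap between that and a day per chunk suggests the time is going to per-record overhead in the import rather than the data volume itself.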
Additional info
AsyncQueue 4.0.0
Button Box 5.0.0
Campaign dev-develop
CKEditor 4.2.0
Code Field 5.0.0
Control Panel CSS 3.0.0
Cookies 5.0.0
Dashboard Begone 3.0.0
Expanded Singles 3.0.0
Feed Me 6.3.0
Field Manager 4.0.2
Formie 3.0.6
Icon Picker 3.0.1
Knock Knock 3.0.1
Neo 5.2.5
SendGrid 3.0.0
SEO 5.1.3
Single Cat 4.0.0
Sprig 3.5.0
Timber 2.0.2