I was looking at our mutual customer Intralox's site to see if there were some ways we could speed up their Preview/Production builds.
A couple of things I noticed:
It seems like you're doing a full fetch on every build — which makes sense, as you don't have a delta API. For faster iteration on local development, it'd be nice to have a flag to just use the cache if available and avoid making any fetches.
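A hypothetical sketch of what such a flag could look like in `gatsby-config.js`. Note that `useCacheIfAvailable` is an invented option name for illustration, not an existing setting of gatsby-source-kontent:

```javascript
// gatsby-config.js — sketch only; `useCacheIfAvailable` is a hypothetical
// option name, not something gatsby-source-kontent currently supports.
module.exports = {
  plugins: [
    {
      resolve: '@kentico/gatsby-source-kontent',
      options: {
        projectId: '<your-project-id>',
        languageCodenames: ['en-US'],
        // Proposed flag: skip Delivery API fetches entirely and reuse the
        // Gatsby cache from the previous build when one is available.
        useCacheIfAvailable: process.env.NODE_ENV === 'development',
      },
    },
  ],
};
```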
I completely understand the second tip. There is also room for optimization by skipping some fetches for multilingual updates.
Regarding the cache — you mean the situation where a developer wants to adjust the code and basically doesn't care about content changes between builds? So they could turn it on, e.g. with a flag in the plugin configuration, that would try the cache first for content fetching?
Do you have an expected boost in mind? My assumption was that the developer creates an isolated environment in Kontent.ai (so that their changes don't purge the Fastly CDN), runs the development there, and then uses a full build and basically relies on the Fastly CDN speed.
An alternative might be to use additionalItemFilterParams to further reduce the amount of data being prefetched.
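For example — assuming `additionalItemFilterParams` accepts Delivery API query parameters roughly like this (the option shape and the filter values below are illustrative, so check the plugin's README for the exact contract) — prefetching could be narrowed to only the content types the site actually renders:

```javascript
// gatsby-config.js — illustrative values; verify the exact shape
// `additionalItemFilterParams` expects against the plugin's README.
module.exports = {
  plugins: [
    {
      resolve: '@kentico/gatsby-source-kontent',
      options: {
        projectId: '<your-project-id>',
        languageCodenames: ['en-US'],
        // Delivery API filter params appended to the items request:
        // fetch only selected content types and limit how deep
        // linked items are resolved.
        additionalItemFilterParams: {
          'system.type[in]': 'article,landing_page',
          depth: 1,
        },
      },
    },
  ],
};
```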
The second thing I noticed: there are sequential `await` calls in https://github.com/Kentico/kontent-gatsby-packages/blob/master/packages/gatsby-source-kontent/src/webhookProcessor.ts — it'd be better to collect the API calls and run them in parallel so they all start and finish at roughly the same time. For Intralox the API calls are taking 2–3 seconds, so ideally those would all take ~500 ms or so if done in parallel.
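A sketch of the parallelization idea. Here `fetchItem` is a stand-in for the real Delivery API calls in `webhookProcessor.ts`, and the speedup assumes the calls are independent of each other:

```javascript
// Sequential version: each request waits for the previous one to finish,
// so total time is the sum of all request durations.
async function processSequentially(codenames, fetchItem) {
  const results = [];
  for (const codename of codenames) {
    results.push(await fetchItem(codename));
  }
  return results;
}

// Parallel version: start every request immediately, then wait for all of
// them at once, so total time is roughly the slowest single request.
async function processInParallel(codenames, fetchItem) {
  return Promise.all(codenames.map((codename) => fetchItem(codename)));
}

// Tiny demo with a simulated 50 ms API call.
const fakeFetch = (codename) =>
  new Promise((resolve) => setTimeout(() => resolve(`item:${codename}`), 50));

async function demo() {
  const codenames = ['article_a', 'article_b', 'article_c'];
  const start = Date.now();
  const items = await processInParallel(codenames, fakeFetch);
  // All three 50 ms "requests" overlap, so this takes ~50 ms, not ~150 ms.
  console.log(items, `${Date.now() - start} ms`);
}

demo();
```

`Promise.all` rejects as soon as any call fails; if partial failures should not abort the whole webhook batch, `Promise.allSettled` is the safer variant.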