This repository has been archived by the owner on Nov 10, 2021. It is now read-only.
API response speed could be dramatically increased if an in-memory cache like Redis were implemented.
What do you think about this?
I could give a hand if you need it https://redis.io/topics/client-side-caching
Without adding new hardware (for the time being), would it be easier to add caching on calls to resources that change rarely but are requested often?
For example, we could cache the feed query for even 10 seconds, in memory (JS). We don't really want people refreshing the feed more often than every 10 seconds anyway, right?
We could even add this to the random feed, which I'm sure would save a lot of bandwidth. Why regenerate it on each call? 10-30 seconds would be fine to give people time to actually look through the items returned.
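A 10-second in-memory cache like the one described above could be sketched with a small TTL wrapper. This is only an illustration: `cached`, `ttlMs`, and `getFeed` are hypothetical names, not identifiers from this repo, and it assumes a single warm Lambda container (each container would hold its own copy).

```javascript
// Minimal in-memory TTL cache sketch (hypothetical, not from the repo).
// Wraps an async function and reuses its last result for `ttlMs` ms.
function cached(fn, ttlMs) {
  let value;
  let expiresAt = 0; // epoch ms after which the cached value is stale
  return async (...args) => {
    const now = Date.now();
    if (now >= expiresAt) {
      value = await fn(...args);
      expiresAt = now + ttlMs;
    }
    return value;
  };
}

// Hypothetical usage: serve the same feed for up to 10 seconds.
// const getFeedCached = cached(getFeed, 10000);
```

One caveat with this approach: the result is cached per container, so two concurrent Lambda containers could still each hit the database once per TTL window.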
All that said, it looks like there is some code in a pull request to do some caching, although I don't fully understand it yet; it may just be throttling. #15
There also seems to be commented-out code in the main branch that looks like it was meant to do some caching, but it appears incomplete.
So there are two ways forward if we want to cache:
(1) Add a CloudFront caching layer in front of the Lambda function, caching on the GET request.
(2) Implement application-level caching with ElastiCache (Redis) in the route handler itself.
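For option (1), no cache code is needed in the app at all; the Lambda handler just has to tell CloudFront how long a response may be reused, via a `Cache-Control` header. A rough sketch, where `getRandomFeed` is a hypothetical stand-in for the real feed query:

```javascript
// Hypothetical feed query stub; the real one would hit the database.
async function getRandomFeed() {
  return [{ id: 1 }, { id: 2 }];
}

// Lambda proxy-style handler for option (1): CloudFront (and browsers)
// may reuse this GET response for up to 30 seconds before re-invoking us.
async function handler(event) {
  const items = await getRandomFeed();
  return {
    statusCode: 200,
    headers: {
      'Content-Type': 'application/json',
      'Cache-Control': 'public, max-age=30',
    },
    body: JSON.stringify(items),
  };
}
```

Option (2) would instead call Redis (e.g. via ElastiCache) inside the route with a `GET`/`SET` plus a TTL, which shares the cache across all Lambda containers at the cost of a network hop and new infrastructure.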