04-27-2021 02:03 PM
Has anyone used a caching strategy with a recipe that's running in the API Platform (i.e., reachable via REST)? My recipe will be called via JavaScript from a webpage, so performance is key. It will be doing a simple query against Dynamics CRM, and the data is pretty static, so I'd like to minimize hits against CRM. One thought I had was to set a cache header on the HTTP response, but I didn't see a way to customize that.
I tried doing a test send from Postman and passing:
Cache-Control: public, max-age=604800, immutable
But no matter what, the response comes back with:
Cache-Control: max-age=0, private, must-revalidate
So then I wondered about dumping the data into a Lookup Table and pulling from there until it "expires", then refreshing from CRM, maybe once per day. But since I can't control the timing (people can hit the page anytime), there's always a risk that a request lands in the very second I'm refreshing the data. I had hoped to keep concurrency at 5 for performance reasons; I realize dropping to 1 might be a way around it, but I'd hate to have requests backed up behind each other too much.
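For reference, this is roughly the behavior I was hoping the cache header would give me, sketched as a hand-rolled cache in the page's TypeScript (the endpoint URL, auth header name, and one-day TTL below are just placeholders, not real Workato values):
```typescript
// Rough sketch only: cache the recipe's response in the calling page,
// since the platform overrides Cache-Control. The endpoint URL, auth
// header name, and TTL are assumptions, not actual Workato values.
const ENDPOINT = "https://apim.workato.com/your-collection/crm-list"; // assumed URL
const CACHE_KEY = "crmListCache";
const TTL_MS = 24 * 60 * 60 * 1000; // re-fetch at most once per day

interface CachedPayload {
  fetchedAt: number; // epoch ms when the data was fetched
  data: unknown;
}

async function getCrmList(): Promise<unknown> {
  const raw = localStorage.getItem(CACHE_KEY);
  if (raw) {
    const cached = JSON.parse(raw) as CachedPayload;
    if (Date.now() - cached.fetchedAt < TTL_MS) {
      return cached.data; // still fresh: no call to the recipe (or CRM)
    }
  }
  const res = await fetch(ENDPOINT, {
    headers: { "API-TOKEN": "<your-token>" }, // header name is an assumption
  });
  const data: unknown = await res.json();
  localStorage.setItem(CACHE_KEY, JSON.stringify({ fetchedAt: Date.now(), data }));
  return data;
}
```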
Curious how others have handled caching in Workato?
Thanks
04-27-2021 02:19 PM
I would also encourage having some kind of in-memory cache built into Workato. I have the same challenge and currently need to store data in my OutSystems app to avoid going out to the sources too frequently. Perhaps Product@Workato has ideas or plans for such a solution. Thanks!
04-27-2021 02:39 PM
Mike and Gernot, thank you for sharing your use cases. API caching is a feature we plan to release this year. This will allow you, the API publisher, to control how you want to configure the cache (duration, cache key, etc.) to minimize the traffic to your data sources.
04-27-2021 02:41 PM
I have used Lookup Tables for caching in high-load situations, and it worked well for my use case.
Instead of doing a bulk data load into the cache, you could design the recipe to:
1. Check to see if the data exists in the lookup table and hasn't expired
2. If missing or expired, fetch the data from Dynamics and populate the lookup table
3. Return result
You could keep concurrency at 5; at worst, if 5 requests for the same data arrive at the exact same time before it exists in the cache, you'd hit Dynamics 5 times. You won't have to do a daily data dump. If you truly want to limit hits to the CRM, just don't purge the data from the Lookup Table and do updates instead (that way the data is never "missing").
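Roughly, the flow looks like this. A minimal sketch in TypeScript, where lookupTable and dynamics are hypothetical stand-ins for the Workato Lookup Table and Dynamics actions (the real recipe is built from those steps, not code):
```typescript
// Hypothetical stand-ins for the Workato Lookup Table and Dynamics steps.
interface CacheRow {
  payload: string;      // cached record(s), stored as JSON text
  refreshed_at: string; // ISO timestamp of the last refresh
}
declare const lookupTable: {
  get(table: string, key: string): Promise<CacheRow | null>;
  upsert(table: string, key: string, row: CacheRow): Promise<void>;
};
declare const dynamics: {
  query(key: string): Promise<Record<string, unknown>>;
};

const CACHE_TTL_MS = 24 * 60 * 60 * 1000; // e.g. treat rows older than a day as expired

async function handleRequest(key: string): Promise<Record<string, unknown>> {
  // 1. Check whether the data exists in the lookup table and hasn't expired
  const row = await lookupTable.get("crm_cache", key);
  const expired =
    !row || Date.now() - new Date(row.refreshed_at).getTime() > CACHE_TTL_MS;

  if (row && !expired) {
    // 3. Return the cached result
    return JSON.parse(row.payload) as Record<string, unknown>;
  }

  // 2. Missing or expired: fetch from Dynamics and repopulate the cache.
  // With concurrency at 5, a few overlapping requests may each hit Dynamics
  // once here; since the row is updated rather than purged, nothing breaks.
  const fresh = await dynamics.query(key);
  await lookupTable.upsert("crm_cache", key, {
    payload: JSON.stringify(fresh),
    refreshed_at: new Date().toISOString(),
  });

  // 3. Return the fresh result
  return fresh;
}
```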
04-27-2021 03:00 PM
Thanks Mike. In my case they're not requesting individual records; it's actually the whole list each time. But maybe I can make a separate recipe that runs each morning at 4 AM, pulls the data from CRM, then updates the Lookup Table one row at a time along with a refresh date, and then removes any rows that didn't get refreshed (because they were removed from CRM). That way nothing is being truncated, like you say. Thanks Mike!
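Roughly what that refresh recipe would do, as a sketch with hypothetical stand-ins (dynamics.listAll and the lookupTable helpers) for the actual Workato steps:
```typescript
// Hypothetical stand-ins for the Dynamics query and Lookup Table actions.
interface CrmRecord {
  id: string;
  [field: string]: unknown;
}
declare const dynamics: {
  listAll(): Promise<CrmRecord[]>; // the full (fairly static) list from CRM
};
declare const lookupTable: {
  upsert(table: string, key: string, row: { payload: string; refreshed_at: string }): Promise<void>;
  deleteRefreshedBefore(table: string, cutoff: string): Promise<void>;
};

// Scheduled recipe, e.g. every morning at 4 AM.
async function dailyRefresh(): Promise<void> {
  const startedAt = new Date().toISOString();

  // Pull the whole list from CRM once, then update/insert one row at a time
  // with a refresh date, so the table is never truncated.
  const records = await dynamics.listAll();
  for (const record of records) {
    await lookupTable.upsert("crm_cache", record.id, {
      payload: JSON.stringify(record),
      refreshed_at: new Date().toISOString(),
    });
  }

  // Remove any rows that didn't get refreshed this run
  // (i.e. records that have since been removed from CRM).
  await lookupTable.deleteRefreshedBefore("crm_cache", startedAt);
}
```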