2 weeks ago
First off, this is my first recipe and I'm teaching myself from docs.workato.com, plus looking at recipes created by the people who preceded me in this job.
I have a recipe that receives a JSON payload via a webhook trigger, and the payload contains a lot of records, roughly 15k.
What is the best practice for iterating through that much data? In the other recipes I've looked at, it can take upwards of an hour to navigate that many records, and coming from a programming background I have to think there is a better way than just doing a FOR-EACH loop through the whole thing.
I cannot use the message queue because it only allows 100 records
How have others tackled processing large recordsets to maintain efficiency and speed? I'm not seeing any batch options available to me on the trigger
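For reference, outside of Workato I'd chunk the payload into batches rather than touch records one at a time. Here's a rough Python sketch of what I mean; the batch size, the sample payload, and the `process` helper are just placeholders, not anything from my actual recipe:

```python
def chunked(records, size=100):
    """Yield the record list in fixed-size slices."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

def process(batch):
    """Placeholder for whatever the downstream step would do with a batch."""
    print(f"handling {len(batch)} records")

# Stand-in for the webhook body; the real payload has ~15k records.
payload = {"records": [{"id": n} for n in range(15000)]}

for batch in chunked(payload["records"]):
    process(batch)  # ~150 batch calls instead of ~15,000 per-record steps
```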
2 weeks ago
It really depends on what you need to do with the records. Can you provide more info?
2 weeks ago
Gary,
In this case it's simply a lift and shift of data from the source system into the target system as an upsert. It's just a lot of records.
2 weeks ago
Then ultimately it depends on the target system. If it has a bulk import API or a similar option like a CSV upload, then you can process in bulk. However, if the target system only has a single-record create/edit API, then you're loopin'.
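To sketch the bulk path: if the target happens to accept a CSV import, 15k records can go over in a handful of requests instead of 15,000 single-record calls. This is a minimal illustration, not a real integration; the endpoint URL, auth scheme, and batch size below are all made up, so swap in whatever your target system actually exposes:

```python
import csv
import io

import requests  # third-party HTTP library


def bulk_upsert_csv(records, url, token, batch_size=5000):
    """POST records to a hypothetical CSV bulk-import endpoint in large batches."""
    for start in range(0, len(records), batch_size):
        batch = records[start:start + batch_size]
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=list(batch[0].keys()))
        writer.writeheader()
        writer.writerows(batch)
        resp = requests.post(
            url,  # e.g. a made-up https://target.example.com/api/bulk_upsert
            headers={"Authorization": f"Bearer {token}"},
            files={"file": ("records.csv", buf.getvalue(), "text/csv")},
        )
        resp.raise_for_status()  # fail fast if the target rejects the batch
```

At 5,000 records per batch that's three requests for the whole payload, which is where the time savings comes from. No bulk option on the target means none of this applies and the per-record loop is what's left.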
2 weeks ago
Looks like I'll be loopin'! Thanks for the insight