
Best Practice for Large Recordsets

rharkness
Deputy Chef I

First off, this is my first recipe. I'm teaching myself from docs.workato.com, plus looking at recipes created by the people who preceded me in this job.

I have a recipe that receives a JSON payload via a webhook trigger. The payload contains a lot of records, roughly 15k.

What is the best practice for iterating through that much data? In other recipes I've looked at, it can take upwards of an hour to work through that many records, and coming from a programming background I have to think there is a better way than just doing a FOR-EACH loop through the whole thing.

I cannot use the message queue because it only allows 100 records.

How have others tackled processing large recordsets to maintain efficiency and speed? I'm not seeing any batch options available to me on the trigger.


4 REPLIES

gary1
Executive Chef III

It really depends on what you need to do with the records. Can you provide more info?

Gary,
In this case it's simply a lift and shift of data from the source system into the target system as an upsert. It's just a lot of records.

gary1
Executive Chef III

Then ultimately it depends on the target system. If it has a bulk import API or a similar option like a CSV upload, then you can process in bulk. However, if the target system only has a single-record create/edit API, then you're loopin'.
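To make the bulk option concrete, here is a minimal sketch of the chunking idea: instead of 15,000 single-record API calls, the payload is sliced into batches and each batch goes out as one bulk call. This is generic Python, not Workato recipe logic; the batch size of 500 and the idea of a bulk endpoint are assumptions, since the actual limit depends on the target system's bulk import API.

```python
# Sketch: slicing a large webhook payload into batches for a bulk import API.
# Assumes the target system accepts some number of records per bulk call
# (500 here is an arbitrary example, not a Workato or vendor limit).

def chunk(records, size):
    """Yield successive batches of at most `size` records."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

# ~15k records arriving from the webhook trigger (dummy data for illustration)
records = [{"id": n} for n in range(15000)]

batches = list(chunk(records, 500))
print(len(batches))  # 30 bulk calls instead of 15,000 single-record calls
```

The win is purely in call count: each bulk call carries its batch in one request, so round-trip overhead scales with the number of batches rather than the number of records.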

Looks like I'll be loopin'! Thanks for the insight.