
Bring 50k from SF

daniel-chupak
Deputy Chef I

Hi,


Has anyone tried to bring about 50k records from Salesforce using the custom action and pagination method?

The issue is that collecting all the data with the 'Accumulate to list' action takes too long, which makes me exceed the 90-minute job limit.


I was wondering if anyone has found another solution?
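For context, the pagination-plus-accumulate pattern described above can be sketched in plain Python (this is only an illustration, not a Workato recipe; `fetch_page` is a hypothetical stand-in for the Salesforce custom action, and accumulating a whole page at a time is much cheaper than one list operation per record):

```python
def fetch_page(offset, page_size, total=50_000):
    """Hypothetical stand-in for one paginated Salesforce custom-action call."""
    end = min(offset + page_size, total)
    return [{"Id": i} for i in range(offset, end)]

def fetch_all(page_size=2_000, total=50_000):
    """Page through all records, accumulating one page per iteration."""
    records = []
    offset = 0
    while offset < total:
        page = fetch_page(offset, page_size, total)
        if not page:
            break
        records.extend(page)  # one list operation per page, not per record
        offset += page_size
    return records
```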

6 REPLIES 6

ajorde
Deputy Chef II

If I need to grab more than 2K lines from Salesforce, I use the bulk > CSV action (just make sure the recipe is started, so that it wakes up 5 minutes later to finish the bulk action). Also, use the mapper / common data models functionality rather than 'Accumulate to list', so you bulk-map the data in one action instead of looping through each record.
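The "bulk-map in one action" idea can be illustrated in plain Python: instead of a recipe step per record, transform every row of the downloaded CSV in a single pass (the field names `Id` and `Name` here are hypothetical):

```python
import csv
import io

def bulk_map(csv_text):
    """Map all CSV rows to the target shape in one pass (no per-record loop)."""
    rows = csv.DictReader(io.StringIO(csv_text))
    # one comprehension transforms every row at once
    return [{"id": r["Id"], "name": r["Name"].strip()} for r in rows]
```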

mroldanvega
Executive Chef I

Bulk is the way to go; I agree with Amy Jorde. This is what we use for these types of use cases. Keep in mind that Bulk doesn't seem to work in Test mode, only when the recipe is active.

daniel-chupak
Deputy Chef I

Alright! Thank you all. 🙂

john-colburn
Deputy Chef I

I ran into a similar problem downloading a large Salesforce report and looping through each line. Add an IF condition at the top of your loop that checks the loop index, and pause for 0.5 seconds every 4,000-5,000 rows. This will let your job run to completion without hitting the 90-minute timeout.
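The throttling suggestion above can be sketched in plain Python (a sketch only, assuming a hypothetical `handle` callback for the per-row work; tune the batch size and pause length to your environment):

```python
import time

def process_rows(rows, batch=5_000, pause=0.5, handle=lambda r: None):
    """Process rows, pausing briefly every `batch` rows to throttle the loop.

    Returns the number of pauses taken, so the throttling can be verified.
    """
    pauses = 0
    for i, row in enumerate(rows, start=1):
        handle(row)
        if i % batch == 0:  # the "IF at the top of the loop" check
            time.sleep(pause)
            pauses += 1
    return pauses
```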