05-27-2021 03:23 PM
Hi,
Has anyone tried to bring in about 50k records from SF using the custom action and pagination method?
The issue is that to collect all the data I use the 'Accumulate to list' action, and it takes so long that I exceed the 90-minute limit.
I was wondering if anyone has found another solution?
05-27-2021 03:32 PM
If I need to grab more than 2K lines from SF, I use the bulk > csv action (just make sure the recipe is started so that it wakes up 5 minutes later to finish the bulk action). Also, you should use the mapper / common data models functionality rather than 'Accumulate to list' to bulk map the data in one action instead of looping through each record.
05-27-2021 03:59 PM
Bulk is the way to go; I agree with Amy Jorde. This is what we use for these types of use cases. Keep in mind that bulk doesn't seem to work in Test mode, only when the recipe is active.
05-27-2021 04:49 PM
Alright! Thank you all.
05-27-2021 05:13 PM
I ran into a similar problem downloading a large SF report and looping through each line. You need to add an IF statement at the top of your loop that checks the index you are looping through and pauses for 0.5 seconds every 4,000-5,000 rows. This lets your job run without hitting the 90-minute timeout.
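For anyone implementing this outside a recipe UI, here is a minimal Python sketch of the same throttling pattern. The function name, batch size, and pause length are my own choices for illustration (the post suggests pausing 0.5 seconds every 4,000-5,000 rows), not anything prescribed by SF or the platform:

```python
import time

PAUSE_EVERY = 5000    # assumed batch size, per the 4,000-5,000 row suggestion above
PAUSE_SECONDS = 0.5   # pause length from the post

def process_report(rows, handle_row, sleep=time.sleep):
    """Process each row, pausing briefly every PAUSE_EVERY rows.

    `sleep` is injectable so the pause can be stubbed out in tests.
    Returns the number of pauses taken.
    """
    pauses = 0
    for index, row in enumerate(rows, start=1):
        handle_row(row)
        # The "IF statement at the top of your loop": check the running
        # index and yield for a moment at each batch boundary.
        if index % PAUSE_EVERY == 0:
            sleep(PAUSE_SECONDS)
            pauses += 1
    return pauses
```

For example, a 12,000-row report would pause twice (after rows 5,000 and 10,000), adding only one second of deliberate delay while keeping the job from running hot for the full 90 minutes.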