<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Topic: Best Practice for Large Recordsets in Workato Pros Discussion Board</title>
    <link>https://systematic.workato.com/t5/workato-pros-discussion-board/best-practice-for-large-recordsets/m-p/9639#M3929</link>
    <description>&lt;P&gt;First off, this is my first recipe and I'm teaching myself from docs.workato.com, plus looking at recipes created by people who preceded me in this job.&lt;BR /&gt;&lt;BR /&gt;I have a recipe that is delivered a payload as JSON over a webhook trigger.&amp;nbsp; It has a lot of records in it, roughly 15k.&lt;/P&gt;&lt;P&gt;What is the best practice for iterating through that much data? From what I've seen in other recipes, it can take upwards of an hour to navigate through that many records, and coming from a programming background I have to think there is a better way than just doing a FOR-EACH loop through the whole thing.&lt;/P&gt;&lt;P&gt;I cannot use the message queue because it only allows 100 records.&lt;/P&gt;&lt;P&gt;How have others tackled processing large recordsets to maintain efficiency and speed?&amp;nbsp; I'm not seeing any batch options available to me on the trigger.&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;</description>
    <pubDate>Tue, 15 Apr 2025 15:37:49 GMT</pubDate>
    <dc:creator>rharkness</dc:creator>
    <dc:date>2025-04-15T15:37:49Z</dc:date>
    <item>
      <title>Best Practice for Large Recordsets</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/best-practice-for-large-recordsets/m-p/9639#M3929</link>
      <description>&lt;P&gt;First off, this is my first recipe and I'm teaching myself from docs.workato.com, plus looking at recipes created by people who preceded me in this job.&lt;BR /&gt;&lt;BR /&gt;I have a recipe that is delivered a payload as JSON over a webhook trigger.&amp;nbsp; It has a lot of records in it, roughly 15k.&lt;/P&gt;&lt;P&gt;What is the best practice for iterating through that much data? From what I've seen in other recipes, it can take upwards of an hour to navigate through that many records, and coming from a programming background I have to think there is a better way than just doing a FOR-EACH loop through the whole thing.&lt;/P&gt;&lt;P&gt;I cannot use the message queue because it only allows 100 records.&lt;/P&gt;&lt;P&gt;How have others tackled processing large recordsets to maintain efficiency and speed?&amp;nbsp; I'm not seeing any batch options available to me on the trigger.&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 15 Apr 2025 15:37:49 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/best-practice-for-large-recordsets/m-p/9639#M3929</guid>
      <dc:creator>rharkness</dc:creator>
      <dc:date>2025-04-15T15:37:49Z</dc:date>
    </item>
    <item>
      <title>Re: Best Practice for Large Recordsets</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/best-practice-for-large-recordsets/m-p/9641#M3931</link>
      <description>&lt;P&gt;It really depends on what you need to do with the records. Can you provide more info?&lt;/P&gt;</description>
      <pubDate>Tue, 15 Apr 2025 17:19:39 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/best-practice-for-large-recordsets/m-p/9641#M3931</guid>
      <dc:creator>gary1</dc:creator>
      <dc:date>2025-04-15T17:19:39Z</dc:date>
    </item>
    <item>
      <title>Re: Best Practice for Large Recordsets</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/best-practice-for-large-recordsets/m-p/9644#M3934</link>
      <description>&lt;P&gt;Gary,&lt;BR /&gt;In this case it's simply a lift-and-shift of data from the source system into the target system as an upsert.&amp;nbsp; It's just a lot of records.&lt;/P&gt;</description>
      <pubDate>Tue, 15 Apr 2025 17:57:35 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/best-practice-for-large-recordsets/m-p/9644#M3934</guid>
      <dc:creator>rharkness</dc:creator>
      <dc:date>2025-04-15T17:57:35Z</dc:date>
    </item>
    <item>
      <title>Re: Best Practice for Large Recordsets</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/best-practice-for-large-recordsets/m-p/9645#M3935</link>
      <description>&lt;P&gt;Then ultimately it depends on the target system. If it has a bulk import API or a similar option like a CSV upload, then you can process in bulk. However, if the target system only has a single-record create/edit API, then you're loopin'.&lt;/P&gt;</description>
      <pubDate>Tue, 15 Apr 2025 18:06:31 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/best-practice-for-large-recordsets/m-p/9645#M3935</guid>
      <dc:creator>gary1</dc:creator>
      <dc:date>2025-04-15T18:06:31Z</dc:date>
    </item>
    <item>
      <title>Re: Best Practice for Large Recordsets</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/best-practice-for-large-recordsets/m-p/9655#M3939</link>
      <description>&lt;P&gt;Looks like I'll be loopin'!&amp;nbsp; Thanks for the insight.&lt;/P&gt;</description>
      <pubDate>Wed, 16 Apr 2025 18:27:47 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/best-practice-for-large-recordsets/m-p/9655#M3939</guid>
      <dc:creator>rharkness</dc:creator>
      <dc:date>2025-04-16T18:27:47Z</dc:date>
    </item>
  </channel>
</rss>

