<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Topic: Processing 100k records from CSV file in Workato Pros Discussion Board</title>
    <link>https://systematic.workato.com/t5/workato-pros-discussion-board/processing-100k-records-from-csv-file/m-p/11826#M4552</link>
    <description>&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Screenshot 2026-02-11 210546.png" style="width: 999px;"&gt;&lt;img src="https://systematic.workato.com/t5/image/serverpage/image-id/2574iD1D005FC1FF0A77D/image-size/large/is-moderation-mode/true?v=v2&amp;amp;px=999" role="button" title="Screenshot 2026-02-11 210546.png" alt="Screenshot 2026-02-11 210546.png" /&gt;&lt;/span&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Screenshot 2026-02-11 211226.png" style="width: 999px;"&gt;&lt;img src="https://systematic.workato.com/t5/image/serverpage/image-id/2575i42AACA23765D6C91/image-size/large/is-moderation-mode/true?v=v2&amp;amp;px=999" role="button" title="Screenshot 2026-02-11 211226.png" alt="Screenshot 2026-02-11 211226.png" /&gt;&lt;/span&gt;Hello everyone,&lt;/P&gt;&lt;P&gt;I want to process 100k records from a CSV file and load them into a database.&lt;BR /&gt;I am getting the error below.&lt;/P&gt;&lt;H4&gt;Error&lt;/H4&gt;&lt;DIV class=""&gt;Job has been shut down by the system. Usually it happens due to high memory consumption. Try reworking the recipe to process less amount of data per step.&lt;/DIV&gt;&lt;DIV class=""&gt;In the For Each step, the batch size is set to 30,000.&lt;/DIV&gt;&lt;DIV class=""&gt;How should the data be processed into the database?&lt;/DIV&gt;</description>
    <pubDate>Wed, 11 Feb 2026 15:43:47 GMT</pubDate>
    <dc:creator>Laxman</dc:creator>
    <dc:date>2026-02-11T15:43:47Z</dc:date>
    <item>
      <title>Processing 100k records from csv file</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/processing-100k-records-from-csv-file/m-p/11826#M4552</link>
      <description>&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Screenshot 2026-02-11 210546.png" style="width: 999px;"&gt;&lt;img src="https://systematic.workato.com/t5/image/serverpage/image-id/2574iD1D005FC1FF0A77D/image-size/large/is-moderation-mode/true?v=v2&amp;amp;px=999" role="button" title="Screenshot 2026-02-11 210546.png" alt="Screenshot 2026-02-11 210546.png" /&gt;&lt;/span&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Screenshot 2026-02-11 211226.png" style="width: 999px;"&gt;&lt;img src="https://systematic.workato.com/t5/image/serverpage/image-id/2575i42AACA23765D6C91/image-size/large/is-moderation-mode/true?v=v2&amp;amp;px=999" role="button" title="Screenshot 2026-02-11 211226.png" alt="Screenshot 2026-02-11 211226.png" /&gt;&lt;/span&gt;Hello everyone,&lt;/P&gt;&lt;P&gt;I want to process 100k records from a CSV file and load them into a database.&lt;BR /&gt;I am getting the error below.&lt;/P&gt;&lt;H4&gt;Error&lt;/H4&gt;&lt;DIV class=""&gt;Job has been shut down by the system. Usually it happens due to high memory consumption. Try reworking the recipe to process less amount of data per step.&lt;/DIV&gt;&lt;DIV class=""&gt;In the For Each step, the batch size is set to 30,000.&lt;/DIV&gt;&lt;DIV class=""&gt;How should the data be processed into the database?&lt;/DIV&gt;</description>
      <pubDate>Wed, 11 Feb 2026 15:43:47 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/processing-100k-records-from-csv-file/m-p/11826#M4552</guid>
      <dc:creator>Laxman</dc:creator>
      <dc:date>2026-02-11T15:43:47Z</dc:date>
    </item>
    <item>
      <title>Re: Processing 100k records from csv file</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/processing-100k-records-from-csv-file/m-p/11828#M4553</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://systematic.workato.com/t5/user/viewprofilepage/user-id/12505"&gt;@Laxman&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;There is no issue with the current recipe design. Although there may not be a documented upper limit for batch size, it is recommended to determine the practical processing threshold through controlled testing.&lt;/P&gt;&lt;P&gt;For example, if testing shows that the system can process up to 5,000 records in a single batch, it would be prudent not to set the batch size exactly at that limit. Instead, configure it slightly lower (e.g., between 4,500 and 4,800 records).&lt;/P&gt;&lt;P&gt;This buffer helps:&lt;BR /&gt;Reduce the risk of memory-related failures&lt;BR /&gt;Improve job stability under varying data volumes&lt;BR /&gt;Handle unexpected payload size variations&lt;BR /&gt;Ensure smoother execution in concurrent environments&lt;/P&gt;&lt;P&gt;Operating slightly below the maximum tested threshold provides better reliability and long-term maintainability.&lt;BR /&gt;&lt;BR /&gt;&lt;BR /&gt;Thanks and Regards,&lt;BR /&gt;Shivakumara K A&lt;/P&gt;</description>
      <pubDate>Wed, 11 Feb 2026 17:51:01 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/processing-100k-records-from-csv-file/m-p/11828#M4553</guid>
      <dc:creator>shivakumara</dc:creator>
      <dc:date>2026-02-11T17:51:01Z</dc:date>
    </item>
    <item>
      <title>Re: Processing 100k records from csv file</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/processing-100k-records-from-csv-file/m-p/11833#M4558</link>
      <description>&lt;P&gt;Hi Shiva,&lt;/P&gt;&lt;P&gt;Thanks for your inputs.&lt;/P&gt;&lt;P&gt;Even with the batch size set to 100, I am still getting an error that the job execution process died.&lt;/P&gt;&lt;P&gt;Are there any best practices for loading the data?&lt;/P&gt;</description>
      <pubDate>Thu, 12 Feb 2026 05:23:45 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/processing-100k-records-from-csv-file/m-p/11833#M4558</guid>
      <dc:creator>Laxman</dc:creator>
      <dc:date>2026-02-12T05:23:45Z</dc:date>
    </item>
    <item>
      <title>Re: Processing 100k records from csv file</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/processing-100k-records-from-csv-file/m-p/11849#M4567</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://systematic.workato.com/t5/user/viewprofilepage/user-id/12505"&gt;@Laxman&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;Could you please share the error message? Also, if processing even 100 records is failing, we may need to use a child recipe with a concurrency of 5, with the batch size chosen based on what the target application accepts, as &lt;a href="https://systematic.workato.com/t5/user/viewprofilepage/user-id/8685"&gt;@shivakumara&lt;/a&gt;&amp;nbsp;mentioned.&lt;/P&gt;</description>
      <pubDate>Fri, 13 Feb 2026 05:37:05 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/processing-100k-records-from-csv-file/m-p/11849#M4567</guid>
      <dc:creator>rajeshjanapati</dc:creator>
      <dc:date>2026-02-13T05:37:05Z</dc:date>
    </item>
    <item>
      <title>Re: Processing 100k records from csv file</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/processing-100k-records-from-csv-file/m-p/11850#M4568</link>
      <description>&lt;P&gt;Hi Rajesh,&lt;BR /&gt;In step 4 it loads all 100k records at once, which is causing the memory issue.&lt;/P&gt;</description>
      <pubDate>Fri, 13 Feb 2026 06:38:56 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/processing-100k-records-from-csv-file/m-p/11850#M4568</guid>
      <dc:creator>Laxman</dc:creator>
      <dc:date>2026-02-13T06:38:56Z</dc:date>
    </item>
    <item>
      <title>Re: Processing 100k records from csv file</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/processing-100k-records-from-csv-file/m-p/11861#M4571</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://systematic.workato.com/t5/user/viewprofilepage/user-id/12505"&gt;@Laxman&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;We can implement this with a do-while action. Since it is a CSV file, you can use a batch-offset approach with the Get lines from CSV file action in Workato FileStorage. Please go through the attached images below, and if you get stuck at any point, please let me know. This solution will work for sure.&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Screenshot 2026-02-13 165840.png" style="width: 999px;"&gt;&lt;img src="https://systematic.workato.com/t5/image/serverpage/image-id/2584iB090BEA2002868C4/image-size/large/is-moderation-mode/true?v=v2&amp;amp;px=999" role="button" title="Screenshot 2026-02-13 165840.png" alt="Screenshot 2026-02-13 165840.png" /&gt;&lt;/span&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Screenshot 2026-02-13 165531.png" style="width: 999px;"&gt;&lt;img src="https://systematic.workato.com/t5/image/serverpage/image-id/2585i3853F70CEECCAFDD/image-size/large/is-moderation-mode/true?v=v2&amp;amp;px=999" role="button" title="Screenshot 2026-02-13 165531.png" alt="Screenshot 2026-02-13 165531.png" /&gt;&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 13 Feb 2026 11:35:41 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/processing-100k-records-from-csv-file/m-p/11861#M4571</guid>
      <dc:creator>rajeshjanapati</dc:creator>
      <dc:date>2026-02-13T11:35:41Z</dc:date>
    </item>
    <item>
      <title>Re: Processing 100k records from csv file</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/processing-100k-records-from-csv-file/m-p/11865#M4572</link>
      <description>&lt;P&gt;Hi Rajesh,&lt;/P&gt;&lt;P&gt;Now I am getting all 100k rows in the output, but I am running into DB-related issues such as bad gateway and&amp;nbsp;&lt;SPAN&gt;Connection reset by peer&lt;/SPAN&gt;.&amp;nbsp;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Capture1.JPG" style="width: 753px;"&gt;&lt;img src="https://systematic.workato.com/t5/image/serverpage/image-id/2587i3B4233125253202F/image-size/large/is-moderation-mode/true?v=v2&amp;amp;px=999" role="button" title="Capture1.JPG" alt="Capture1.JPG" /&gt;&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 13 Feb 2026 11:54:12 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/processing-100k-records-from-csv-file/m-p/11865#M4572</guid>
      <dc:creator>Laxman</dc:creator>
      <dc:date>2026-02-13T11:54:12Z</dc:date>
    </item>
    <item>
      <title>Re: Processing 100k records from csv file</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/processing-100k-records-from-csv-file/m-p/11868#M4573</link>
      <description>&lt;P&gt;&lt;a href="https://systematic.workato.com/t5/user/viewprofilepage/user-id/12505"&gt;@Laxman&lt;/a&gt;&amp;nbsp;please follow my previous post and guidance. I have:&lt;BR /&gt;1. Created a variable.&lt;BR /&gt;2. Added a do-while action block.&lt;BR /&gt;3. Inside the do-while block, call the Get lines from CSV file action and, in the batch offset, map the variable created in step 1.&lt;BR /&gt;4. In the while-loop condition, map the Has more rows pill (as shown in the images in my previous reply).&lt;/P&gt;&lt;P&gt;I hope that makes sense.&lt;/P&gt;</description>
      <pubDate>Fri, 13 Feb 2026 13:51:02 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/processing-100k-records-from-csv-file/m-p/11868#M4573</guid>
      <dc:creator>rajeshjanapati</dc:creator>
      <dc:date>2026-02-13T13:51:02Z</dc:date>
    </item>
    <item>
      <title>Re: Processing 100k records from csv file</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/processing-100k-records-from-csv-file/m-p/11869#M4574</link>
      <description>&lt;P&gt;Hi Rajesh,&lt;BR /&gt;Thanks for your valuable suggestions.&lt;BR /&gt;Please share the recipe if possible; that would be much appreciated.&lt;/P&gt;</description>
      <pubDate>Fri, 13 Feb 2026 14:39:31 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/processing-100k-records-from-csv-file/m-p/11869#M4574</guid>
      <dc:creator>Laxman</dc:creator>
      <dc:date>2026-02-13T14:39:31Z</dc:date>
    </item>
    <item>
      <title>Re: Processing 100k records from csv file</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/processing-100k-records-from-csv-file/m-p/11872#M4575</link>
      <description>&lt;P&gt;Sure, here is the recipe link:&lt;BR /&gt;&lt;A href="https://app.trial.workato.com/recipes/145225?st=70149a74216f90c7cd686cd8bd32984f558a964774ff412a629b8beed6060ce9" target="_self"&gt;100K records handling recipe with csv file from File Storage&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 13 Feb 2026 16:25:28 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/processing-100k-records-from-csv-file/m-p/11872#M4575</guid>
      <dc:creator>rajeshjanapati</dc:creator>
      <dc:date>2026-02-13T16:25:28Z</dc:date>
    </item>
    <item>
      <title>Re: Processing 100k records from csv file</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/processing-100k-records-from-csv-file/m-p/11873#M4576</link>
      <description>&lt;P&gt;Hi Rajesh,&lt;BR /&gt;Thanks a lot for your support. The Get lines from CSV file action in Workato FileStorage can load up to 50k records at a time, so we need to run it twice for 100k records, and then use For Each twice to load them into the DB.&lt;BR /&gt;Is this good practice or not?&lt;/P&gt;</description>
      <pubDate>Fri, 13 Feb 2026 16:54:13 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/processing-100k-records-from-csv-file/m-p/11873#M4576</guid>
      <dc:creator>Laxman</dc:creator>
      <dc:date>2026-02-13T16:54:13Z</dc:date>
    </item>
    <item>
      <title>Re: Processing 100k records from csv file</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/processing-100k-records-from-csv-file/m-p/11874#M4577</link>
      <description>&lt;P&gt;As per my recipe, the 4th step gives you 50K records. After that, if your DB does not support loading 50K records at once, add a For Each action with the batch size your DB supports; otherwise, map the 50K records directly to the DB. I hope that makes sense.&lt;BR /&gt;&lt;BR /&gt;The For Each action is only needed when the DB limits the batch size; otherwise it is not required.&lt;/P&gt;</description>
      <pubDate>Fri, 13 Feb 2026 17:32:42 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/processing-100k-records-from-csv-file/m-p/11874#M4577</guid>
      <dc:creator>rajeshjanapati</dc:creator>
      <dc:date>2026-02-13T17:32:42Z</dc:date>
    </item>
  </channel>
</rss>

