02-11-2026 07:43 AM
Hello everyone
I want to process 100k records from a CSV file and load them into a database.
Here I'm getting the below error:
02-11-2026 09:51 AM
Hi @Laxman,
There is no issue with the current recipe design. Although there may not be a documented upper limit for batch size, it is recommended to determine the practical processing threshold through controlled testing.
For example, if testing shows that the system can process up to 5,000 records in a single batch, it would be prudent not to set the batch size exactly at that limit. Instead, configure it slightly lower (e.g., between 4,500 and 4,800 records).
This buffer helps:
- Reduce the risk of memory-related failures
- Improve job stability under varying data volumes
- Handle unexpected payload size variations
- Ensure smoother execution in concurrent environments
Operating slightly below the maximum tested threshold provides better reliability and long-term maintainability.
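Just to make the rule concrete, here is a minimal sketch in Python (not a Workato recipe; the numbers are illustrative, assuming your own load testing found 5,000 to be the ceiling):

```python
# Sketch of the "stay below the tested ceiling" rule.
# All numbers here are illustrative assumptions, not Workato defaults.
TESTED_MAX_BATCH = 5_000   # largest batch size that passed your load testing
SAFETY_FACTOR = 0.95       # run ~5% under the tested ceiling
BATCH_SIZE = int(TESTED_MAX_BATCH * SAFETY_FACTOR)   # -> 4750
```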
Thanks and Regards,
Shivakumara K A
02-11-2026 09:23 PM
Hi Shiva,
Thanks for your inputs.
Even with the batch size set to 100, I'm still getting an error: the job execution process died.
Are there any best practices for loading this data?
02-12-2026 09:37 PM
Hi @Laxman,
Could you please share the error message? Also, if processing even 100 records is failing, we may need to use a child recipe with a concurrency of 5, with the batch size chosen based on what the target application accepts, as @shivakumara mentioned.
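As a rough analogy for what "child recipe with concurrency 5" does, here is a Python sketch: at most five batches are processed in flight at any time. `process_batch()` is a hypothetical placeholder for the child recipe's work, not a Workato API:

```python
# At most `concurrency` batches run at once; the rest wait in the queue,
# the same way a child recipe with concurrency 5 limits parallel jobs.
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_batches(batches, process_batch, concurrency=5):
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        futures = [pool.submit(process_batch, batch) for batch in batches]
        for future in as_completed(futures):
            future.result()   # surface any per-batch failure immediately
```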
02-12-2026 10:38 PM
Hi Rajesh,
In step 4 it loads all 100k records at once, which is causing the memory issue.
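For context, the memory-safe pattern is to stream the file in fixed-size chunks instead of materializing all 100k rows at once. A minimal Python sketch of that idea (the file name, batch size, and insert step are illustrative assumptions):

```python
# Stream the CSV in fixed-size chunks so only one batch is in memory at a time.
import csv
from itertools import islice

def iter_batches(path, batch_size=500):
    with open(path, newline="") as f:
        reader = csv.DictReader(f)           # reads one row at a time
        while True:
            batch = list(islice(reader, batch_size))
            if not batch:
                break
            yield batch                      # at most one batch held in memory

for batch in iter_batches("records.csv"):    # hypothetical file name
    ...  # bulk-insert this batch into the database here
```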