10-24-2024 02:25 AM
(Chunk large amounts of data (including files) to make it easier to perform downstream operations. For example: breaking a batch of 5000 records into chunks of 500 to prevent an API timeout.)
I tried the above example by taking 5,000 records with 5 columns and chunking them into batches of 500 inside the Python code block; it took about 1.3 seconds. So I need a conclusion on where bulk records should be chunked to prevent the API timeout (the 90-second limit): within the Python code block, or externally before the records reach the recipe?
10-27-2024 11:12 PM - edited 10-27-2024 11:13 PM
Hi @Patel0786 ,
Solution 1 (without using Python code): once you receive the request payload, loop over the records (a for loop acting as a batch process) with a batch size of 500 and send each batch to an asynchronous function call to process it. Repeat this for the remaining records (a sketch of the batching logic is shown below).
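To make the idea concrete, here is a minimal Python sketch of the same batching logic. It is only an illustration: in the recipe itself this would be a batch loop plus an async recipe function call, and `dispatch_async` below is a hypothetical stand-in for that call.

```python
def dispatch_async(batch):
    # Placeholder: in Workato this would be an asynchronous recipe
    # function call that processes one batch of records.
    print(f"dispatched {len(batch)} records")

def process_in_batches(records, batch_size=500):
    # Walk over the records in steps of `batch_size` and hand each
    # slice off for asynchronous processing.
    for start in range(0, len(records), batch_size):
        dispatch_async(records[start:start + batch_size])

# Example: 5,000 dummy records processed in batches of 500.
process_in_batches([{"id": i} for i in range(5000)])
```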
Thanks and regards,
Shivakumara A
10-24-2024 08:49 AM
Could you provide more information? Your scenario and question are not clear.
10-27-2024 09:32 PM
I have more than 5,000 records, and the job takes more than 90 seconds to execute and times out. How do I use the record-splitting function in the Python code block to send the records in batches?
It is mentioned in the documentation:
https://docs.workato.com/connectors/python.html
For example: breaking a batch of 5000 records into chunks of 500 to prevent an API timeout.
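As a rough illustration, a chunking step inside the Python connector could look like the sketch below. It assumes the connector's usual `main(input)` entry point, with `records` as a list input field declared in the action; the field names are illustrative, not taken from the documentation example.

```python
def main(input):
    records = input.get("records", [])
    chunk_size = 500

    # Split the record list into chunks of at most `chunk_size` items.
    chunks = [records[i:i + chunk_size]
              for i in range(0, len(records), chunk_size)]

    # Return the chunks so downstream recipe steps (e.g. a repeat loop
    # calling the API) can send one chunk per request.
    return {"chunks": chunks, "chunk_count": len(chunks)}
```

Each chunk can then be mapped to one API call in a repeat step, which keeps every individual request well under the 90-second limit.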