
Chunk large amounts of data (including files)

Patel0786
Deputy Chef II

(Chunk large amounts of data, including files, to make downstream operations easier. For example: breaking a batch of 5000 records into chunks of 500 to prevent an API timeout.)

I tried the example above, chunking 5000 records with 5 columns into chunks of 500 within Python code, and it took 1.3 sec. So where should bulk records be chunked: within the Python code, or externally, to stay under the 90-second limit and prevent an API timeout?
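For reference, a minimal sketch of the chunking step described above, in plain Python. The record contents here are placeholder dicts (5 dummy columns); any list of records works the same way:

```python
# Minimal sketch: split a list of records into fixed-size chunks
# (batch size 500, as in the example above).

def chunk_records(records, size=500):
    """Yield successive chunks of at most `size` records."""
    for start in range(0, len(records), size):
        yield records[start:start + size]

# 5000 dummy records with 5 columns each (placeholder data)
records = [{"c1": i, "c2": i, "c3": i, "c4": i, "c5": i} for i in range(5000)]

chunks = list(chunk_records(records, 500))
print(len(chunks))     # 10 chunks
print(len(chunks[0]))  # 500 records in each
```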

1 ACCEPTED SOLUTION

Hi @Patel0786 ,

Solution 1 (without using Python code): once you receive the payload request, loop (for loop) over the records as a batch process with 500 as your batch size, and send each batch to an Async function call for processing. Repeat until the remaining records are processed.
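The batch-loop pattern above can be sketched in plain Python, with `asyncio` standing in for Workato's async function call. `process_batch` is a hypothetical placeholder for the real downstream API call, not part of any Workato API:

```python
import asyncio

async def process_batch(batch):
    # Placeholder for the real (slow) downstream call.
    await asyncio.sleep(0)
    return len(batch)

async def main(records, batch_size=500):
    # Loop over the records in batches of `batch_size` and
    # dispatch each batch as its own async task.
    tasks = []
    for start in range(0, len(records), batch_size):
        batch = records[start:start + batch_size]
        tasks.append(asyncio.create_task(process_batch(batch)))
    # Wait for all batches to finish.
    return await asyncio.gather(*tasks)

records = list(range(5000))
processed = asyncio.run(main(records))
print(sum(processed))  # 5000: every record was handled in some batch
```

Because each batch is dispatched independently, no single call has to process all 5000 records, which is what keeps the recipe under the timeout.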

Thanks and regards,
Shivakumara A


3 REPLIES

gary1
Executive Chef III

Could you provide more information? Your scenario and question are not clear.

I have more than 5000 records, which takes more than 90 sec to execute and times out. How do I use the record-splitting function in the Python code block to send the records in batches? It is described in the documentation:
https://docs.workato.com/connectors/python.html
For example: breaking a batch of 5000 records into chunks of 500 to prevent an API timeout.
