Thursday
(Chunk large amounts of data (including files) to make it easier to perform downstream operations. For example: breaking a batch of 5000 records into chunks of 500 to prevent an API timeout.)
I tried the above example: chunking 5000 records with 5 columns into chunks of 500 within Python code took 1.3 sec. So I need a conclusion on where to chunk bulk records: within the Python code or externally? This matters both when execution exceeds 90 sec and for preventing API timeouts.
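For reference, a minimal sketch of the chunking test described above, assuming the records are a list of dicts with 5 columns (the column names and the downstream send step are illustrative assumptions, not from the Workato docs):

```python
# Split a batch of records into chunks of 500 before sending each
# chunk downstream (the downstream call itself is a placeholder).

def chunk_records(records, chunk_size=500):
    """Yield successive chunks of at most chunk_size records."""
    for start in range(0, len(records), chunk_size):
        yield records[start:start + chunk_size]

# Example: 5000 records with 5 columns each
records = [{f"col{c}": r for c in range(5)} for r in range(5000)]

chunks = list(chunk_records(records))
print(len(chunks))     # -> 10 chunks
print(len(chunks[0]))  # -> 500 records per chunk
```

Slicing like this is cheap in Python (sub-second for 5000 rows), so the cost question is less about the split itself and more about where the per-chunk API calls run.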
Thursday
Could you provide more information? Your scenario and question are not clear.
Thursday
Could you please explain the sentence below from the Workato documentation?
"Chunk large amounts of data (including files) to make it easier to perform downstream operations. For example: breaking a batch of 5000 records into chunks of 500 to prevent an API timeout."