How to read only one batch from AWS SQS New Messages (batch) trigger

nikhilb
Deputy Chef I

Hello Pros!

We have a scenario where we read messages from AWS SQS, and we're having trouble reading only one batch at a time.
Example: We have 10k messages in SQS. The Workato trigger (AWS SQS New Messages (batch)) reads messages in batches of up to 2k, but as soon as it finds 10k messages it reads all 10k in 2k batches, queuing up 5 jobs in our recipe.
Requirement: We want to read only one batch even when there are more than 2k messages, and read the next batch only after the current batch has been processed.
Can someone guide us on how to handle this?

Thanks in advance!

1 ACCEPTED SOLUTION

gibson-cabello
Workato employee

Hi there!

Can you confirm if you're looking to process the batch one at a time?

Typically, with a batch trigger, once you start the recipe it generates jobs based on the batch size set in the trigger, but it processes those jobs one at a time.

Can you confirm if you have access to Recipe Concurrency? https://docs.workato.com/recipes/settings.html#concurrency

If so, can you confirm if the recipe concurrency is set to 1?

Thank you!
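The distinction described here, jobs created up front from the full backlog but executed one at a time under a concurrency of 1, can be sketched in plain Python (an in-memory stand-in, not Workato's actual implementation; all names and the 2k batch size are illustrative):

```python
from collections import deque

BATCH_SIZE = 2000  # illustrative: the batch size set on the trigger

def make_jobs(messages):
    """Batch-trigger behavior: ALL pending messages are split into
    jobs immediately, so 10k messages queue up 5 jobs at once."""
    msgs = list(messages)
    return deque(msgs[i:i + BATCH_SIZE] for i in range(0, len(msgs), BATCH_SIZE))

def run_with_concurrency_one(jobs, process):
    """Concurrency 1: the queued jobs still exist, but only one
    is processed at a time, in order."""
    completed = 0
    while jobs:
        process(jobs.popleft())
        completed += 1
    return completed

jobs = make_jobs(range(10_000))
assert len(jobs) == 5  # all 5 jobs are queued before any processing starts
done = run_with_concurrency_one(jobs, lambda batch: None)
```

So concurrency controls how many jobs run in parallel, not how many jobs get created.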


2 REPLIES


Hi!

Thanks for the response!
We're using the SQS New Messages trigger with concurrency set to 1. As I mentioned, this works fine when there are fewer than 2k messages to read.
But when there are more messages than the batch size, the trigger automatically reads all of them (even if there are 100k) and creates the jobs up front (50 jobs for 100k messages). Those jobs are not in the 'Processing' state; they are just queued to be processed.
We are trying to avoid queuing up jobs, and want a new job to be created only when the current one has finished processing.

Thanks for the help!
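The behavior being asked for here, reading one batch and fetching the next only after processing completes, can be sketched in plain Python (an in-memory deque stands in for SQS; all names are illustrative, and a real consumer would poll SQS instead, where a single ReceiveMessage call returns at most 10 messages):

```python
from collections import deque

BATCH_SIZE = 2000  # illustrative batch size

def receive_batch(queue, size=BATCH_SIZE):
    """Stand-in for polling the queue: returns at most `size` messages."""
    return [queue.popleft() for _ in range(min(size, len(queue)))]

def drain(queue, process):
    """Read one batch, process it to completion, then read the next.
    At no point is more than one batch in flight or queued."""
    rounds = 0
    while queue:
        batch = receive_batch(queue)
        process(batch)  # the next receive happens only after this returns
        rounds += 1
    return rounds

queue = deque(range(10_000))  # 10k pending messages
rounds = drain(queue, lambda batch: None)  # 5 sequential batches
```

In Workato terms this would mean the trigger fetching on demand rather than eagerly splitting the whole backlog into queued jobs.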