07-14-2021 12:10 AM
We are planning to leverage Workato to consume events from Kafka generated by our product. Has anyone worked on similar use cases, and are there any limitations or gotchas I should know about before implementing it?
07-28-2021 08:00 AM
Hi Rakesh,
Any pitfalls to look out for?
07-31-2021 08:03 AM
Steven Marissens - Yes, there are a few, especially for recipes that consume a high volume of events from Kafka, but there are ways to avoid these errors. Stopping a consumer recipe will lose messages, as it won't store the latest offset. For high volumes, don't build heavy logic into the consumer recipe: it will delay Workato jobs and reduce the consumption speed (number of batches per minute) from Kafka, even if the Kafka fetch byte size is increased. Workato charges per task, so if your integrations deal with high volume, they can become expensive over time.
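The "keep the consumer thin" advice above can be sketched outside Workato with a plain queue-and-worker pattern: the consumer loop only hands messages off, so consumption speed is not tied to processing time. This is a hypothetical illustration (no real Kafka client or Workato recipe involved), assuming messages arrive as a simple list:

```python
import queue
import threading

def run_pipeline(messages):
    """Thin 'consumer' loop: enqueue each message for a worker instead of
    processing it inline, so heavy logic never slows down consumption."""
    work_q = queue.Queue()
    processed = []

    def worker():
        while True:
            msg = work_q.get()
            if msg is None:                # sentinel: no more messages
                break
            processed.append(msg.upper())  # stand-in for heavy recipe logic
            work_q.task_done()

    t = threading.Thread(target=worker)
    t.start()

    # Consumer loop: only enqueue, never do heavy work here.
    for msg in messages:
        work_q.put(msg)

    work_q.put(None)  # signal the worker to stop
    t.join()
    return processed
```

In a real deployment the queue would typically be an intermediate topic or job queue rather than an in-process `queue.Queue`, but the separation of concerns is the same.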
07-31-2021 01:06 PM
Excellent feedback, Rakesh, much appreciated. My main concern now is the storage of offsets, as it would mean unwanted behaviour when you upgrade the version of the consumer or in case of downtime at Workato. For me, it undermines the case for using Kafka.
How do you deal with that? Or is this rather a much-needed enhancement Workato should look into?
07-31-2021 08:07 AM
It's also very hard to debug issues when we consume in batches.