07-14-2021 12:10 AM
We are planning to leverage Workato to consume events from Kafka generated by our product. Has anyone worked on similar use cases, and are there any limitations or gotchas I should know about before implementing it?
07-14-2021 09:43 AM
We are looking into a similar use case with either Kafka (self-managed) or Confluent Cloud (managed Kafka), currently in a POC phase. We've been able to produce and consume messages using Workato.
I do think the Confluent Cloud connector could offer broader configuration / features. One example: consuming only from a specific partition instead of all of them (the default). I'm also not sure how consumer groups would work with Workato (i.e. concurrency greater than 1?).
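For what it's worth, the consumer-group question comes down to how Kafka divides partitions among group members: each partition is owned by exactly one consumer in the group, so concurrency greater than 1 only helps up to the partition count. A minimal Python model of that idea (the function and names are hypothetical, not the Workato or Confluent API):

```python
def assign_partitions(partitions, consumers):
    """Model Kafka's group assignment: each partition is owned by exactly
    one consumer in the group; consumers beyond the partition count sit idle."""
    assignment = {c: [] for c in consumers}
    for i, p in enumerate(partitions):
        assignment[consumers[i % len(consumers)]].append(p)
    return assignment

# A 3-partition topic consumed by a group of 2 workers:
print(assign_partitions([0, 1, 2], ["worker-a", "worker-b"]))
# A group larger than the partition count leaves members idle:
print(assign_partitions([0, 1, 2], ["w1", "w2", "w3", "w4"]))
```

So if Workato's concurrency maps onto a consumer group, raising it past the topic's partition count would buy nothing; if each job instead acts as an independent consumer, you would see duplicate deliveries. That is exactly the behaviour worth verifying in a POC.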
Apart from that, something on our to-do list is to rigorously test the following behaviour:
- When a new version of the recipe is promoted, does it impact the offset?
- Impact of starting a clone on the offset
- Impact of an incident at Workato itself
- Handling errors (a DLQ topic or something else)
- Monitoring (does x messages produced into the topic = x messages consumed from the topic? If not, how to identify the deltas?)
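The monitoring check in the last bullet is essentially a lag calculation: compare each partition's log-end offset with the group's committed offset. A rough sketch in Python (the offset dicts are hypothetical inputs you would fetch from the broker with a Kafka client):

```python
def consumer_lag(end_offsets, committed_offsets):
    """Per-partition lag = log-end offset minus committed offset.
    Any positive value means messages produced but not yet consumed."""
    return {
        tp: end_offsets[tp] - committed_offsets.get(tp, 0)
        for tp in end_offsets
    }

# (topic, partition) -> offset; illustrative numbers only
end = {("orders", 0): 1500, ("orders", 1): 980}
committed = {("orders", 0): 1500, ("orders", 1): 950}

lag = consumer_lag(end, committed)
print(lag)                # delta per partition
print(sum(lag.values()))  # total delta across the topic
```

Of course this only works if the consumer actually commits offsets under a consumer group, which ties into the visibility issue discussed below.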
07-15-2021 04:29 PM
We're also using the Confluent connector, and highly recommend Confluent. Workato + Kafka has been pretty straightforward for us; the main prerequisite is getting the schema defined correctly (if it isn't already). Happy to collaborate on anything if it's helpful, and +1 on Steven's point about asking Workato to add broader configuration / features.
07-15-2021 06:28 PM
Hi Tony,
We do experience an issue when consuming from a Confluent Cloud topic: we are able to pull messages from the topic into Workato, but the consumer never appears in the Confluent Cloud data flow or under cluster -> clients -> consumer groups.
This is an essential piece for monitoring consumer lag etc.
Have you encountered this issue before?
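One sanity check for this, assuming you can reach the cluster with the standard Kafka CLI tools, is to ask the broker directly whether the group and its committed offsets exist (the group name below is a placeholder; use whatever group.id the connector actually sets):

```shell
# List all consumer groups the broker knows about (Confluent Cloud
# requires SASL credentials in a client properties file):
kafka-consumer-groups --bootstrap-server $BOOTSTRAP_SERVER \
  --command-config client.properties --list

# Describe one group: shows per-partition committed offset, log-end
# offset, and lag. "workato-consumer" is a hypothetical group.id.
kafka-consumer-groups --bootstrap-server $BOOTSTRAP_SERVER \
  --command-config client.properties --describe --group workato-consumer
```

If the group is absent even here, the connector may be reading without committing offsets under a consumer group at all, which would explain why Confluent Cloud has nothing to display.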
Thanks in advance for your feedback.
07-27-2021 09:40 PM
We've built a lot of integrations between Workato and Kafka. Happy to share information.