One Workflow per user with Kafka connector at a time

Hi Zeebe team,

I want to hear your opinion on my use case; any feedback would be appreciated.

I have a number of workflows per user, e.g. userCreate and userUpdate.
A user can create a profile, but the instance is not completed immediately (e.g. data has to be transmitted to an external system that is not available at the moment, or manual work in the external system is required) → the user can already access the profile.
The userUpdate instance should only be started after the userCreate instance has finished (ordering for the external system has to be guaranteed).

userCreate and userUpdate messages are published to Kafka and consumed with the Zeebe Kafka Connector, which starts the instance.
The problem is that both workflows are started immediately, but Zeebe should wait for the userCreate instance to finish before it starts the userUpdate instance.

What I could do is create an “entry point” workflow which starts an activity based on the given event type, but this seems wrong to me. Is this a common problem, and are there already solutions for it?
Because the Kafka connector always publishes the messages to Zeebe, I am unsure whether I can solve this there.
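To make the ordering requirement concrete, here is a rough Python sketch (all names are made up, this is not our real code) of the behavior I would like to see: hold back a userUpdate until the userCreate for the same user has completed.

```python
from collections import defaultdict, deque

class PerUserGate:
    """Buffers userUpdate events until the userCreate instance
    for the same user has completed."""

    def __init__(self):
        self.create_done = set()           # users whose userCreate finished
        self.pending = defaultdict(deque)  # buffered updates per user

    def on_event(self, user_id, event_type):
        """Returns the list of events that may be started right now."""
        if event_type == "userCreate":
            return [(user_id, "userCreate")]
        if user_id in self.create_done:
            return [(user_id, "userUpdate")]
        self.pending[user_id].append("userUpdate")  # hold back for later
        return []

    def on_create_completed(self, user_id):
        """Called when the userCreate instance finishes;
        releases the buffered updates in FIFO order."""
        self.create_done.add(user_id)
        return [(user_id, e) for e in self.pending.pop(user_id, deque())]
```

So an update arriving before the create has finished is parked, and released in FIFO order as soon as the create completes.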

Thank you so much and I appreciate your product,
Thomas

Hi Thomas - great to see you here!

Let me check if I understood your situation.

You have two records on Kafka: “userCreate” and “userUpdate”.
They are put on Kafka from some other parts of the system.
You want to react to them by creating a new workflow instance on Zeebe.
But the “UserUpdateWorkflow” needs to wait until the “UserCreateWorkflow” is finished?

Can the “userUpdate” record be created independently of the “userCreate” too?

So, for example, I was thinking you could wait for the completion of the creation after starting the update:
[workflow diagram]

Or you might have to check the user status:

How you do this exactly depends on a couple of factors in your overall architecture (e.g. the above version would only work if you cannot miss a “UserCreation Completed” event on Kafka…).
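To illustrate the first variant: in Zeebe terms the update workflow would open a message subscription on a correlation key (the user id) and only proceed once the “UserCreation Completed” message arrives. A toy Python model of that correlation behavior (names invented, message TTL ignored):

```python
class MessageBroker:
    """Toy model of message correlation: a workflow instance opens a
    subscription on (message name, correlation key); a published message
    with the same pair wakes exactly that instance. Messages published
    before anyone subscribes are buffered, so they cannot be missed."""

    def __init__(self):
        self.subscriptions = {}  # (message_name, key) -> callback
        self.buffered = {}       # (message_name, key) -> payload

    def subscribe(self, message_name, key, callback):
        if (message_name, key) in self.buffered:
            callback(self.buffered.pop((message_name, key)))
        else:
            self.subscriptions[(message_name, key)] = callback

    def publish(self, message_name, key, payload):
        cb = self.subscriptions.pop((message_name, key), None)
        if cb:
            cb(payload)
        else:
            self.buffered[(message_name, key)] = payload
```

The buffering branch is exactly the caveat above: without it (or with a too-short message TTL), a completion event published before the update instance subscribes would be lost.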

Best
Bernd

Hi Bernd,

thank you for your answer!

I have multiple records per user (around five at the moment, potentially more in the future - up to 20 different record types per user, I believe) which have to be transmitted to the external system, e.g. 1) user updates his profile, 2) user creates a new item, 3) user updates his profile → these have to be transmitted in this FIFO order. Every user has its own topic, so users do not block each other in case of an error.

So basically I was thinking about creating this simple workflow to call the process depending on the processId variable.

In my tests, this started only one workflow per user - basically an entry point - and that is exactly the result I would like to see. However, it seems a little “wrong” to me, because one workflow depends on another workflow.
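The entry-point logic itself is tiny. As a Python sketch (the process ids and the `start_instance` callback are illustrative stand-ins for the real Zeebe create-instance call):

```python
def dispatch(record, start_instance):
    """Entry-point logic: pick the workflow to start from the record's
    processId variable. `start_instance` stands in for the Zeebe client's
    create-instance command; the process ids are examples."""
    known = {"userCreate", "userUpdate", "itemCreate"}
    process_id = record.get("processId")
    if process_id not in known:
        raise ValueError(f"unknown processId: {process_id}")
    return start_instance(process_id, record.get("variables", {}))
```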

The only problem I see with this approach at the moment is that the Kafka Connector does not guarantee the order in which Zeebe receives the messages (network race conditions etc., because the publishes are asynchronous).
We probably have to publish synchronously per topic (maybe by extending the Kafka connector), which we are currently working on.
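What I mean by “sync per topic”, as a minimal Python sketch (the `send` callback is a stand-in for the real producer call): wait for each record to be acknowledged before sending the next one for that topic, so retries can never reorder records. (With the plain Kafka Java producer this concern is usually addressed with `enable.idempotence=true` / `max.in.flight.requests.per.connection`.)

```python
def publish_fifo(records, send):
    """Publish records strictly one at a time: `send` must block (or be
    awaited) until the broker acknowledges the record, so at most one
    record per topic is ever in flight and FIFO order is preserved."""
    acked = []
    for record in records:
        send(record)          # synchronous: nothing else is in flight
        acked.append(record)  # only now may the next record go out
    return acked
```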

Performance-wise it should not be a problem, because the bottleneck will always be the external system.

I hope it is clearer now what I’m trying to achieve; I wanted to hear your opinion on it.

Thank you so much,
Thomas

All events arrive on the same Kafka topic, and that’s why you need a kind of “gateway” process to start the right workflow instance based on the event type?

That would be a valid pattern. As an alternative, this selection should also be doable using filters (see the Confluent documentation) - but somehow I have in the back of my mind that we still need to add something to be able to use Kafka Connect transformations?
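For reference, such a filter would look roughly like this in the connector config. This is a sketch, not a tested config: the connector class name is from memory, and the built-in `Filter` transform with predicates (standard Kafka Connect since 2.6) assumes the event type is carried in a record header named `userCreate`; filtering on a field inside the value would need a custom predicate.

```json
{
  "name": "zeebe-user-create-sink",
  "config": {
    "connector.class": "io.zeebe.kafka.connect.ZeebeSinkConnector",
    "topics": "user-events",
    "transforms": "onlyCreates",
    "transforms.onlyCreates.type": "org.apache.kafka.connect.transforms.Filter",
    "transforms.onlyCreates.predicate": "hasCreateHeader",
    "transforms.onlyCreates.negate": "true",
    "predicates": "hasCreateHeader",
    "predicates.hasCreateHeader.type": "org.apache.kafka.connect.transforms.predicates.HasHeaderKey",
    "predicates.hasCreateHeader.name": "userCreate"
  }
}
```

Note the `negate: true`: the `Filter` transform drops records when its predicate matches, so negating it keeps only the records that do have the header.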

The only problem I see with this approach at the moment is that the Kafka Connector does not guarantee the order in which Zeebe receives the messages (network race conditions etc., because the publishes are asynchronous).

As far as I can see, the Zeebe Connector actually keeps the ordering when fetching jobs - but I am not sure if it keeps doing so when you scale out Kafka Connect.

But feel free to adjust the Connector to your needs - actually it is not doing too much anyway, so a custom connector that does exactly what you need might be easier to create and maintain.
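The core of such a custom connector is quite small. A Python skeleton (the `poll`, `create_instance` and `commit` callbacks are stand-ins for the real Kafka consumer and Zeebe client calls):

```python
def run_sink(poll, create_instance, commit):
    """Skeleton of a custom sink: poll records, create one workflow
    instance per record, and only commit the offset once Zeebe has
    accepted it. Processing strictly one record at a time and committing
    afterwards preserves per-partition ordering, even across restarts
    (at the cost of possible duplicates after a crash)."""
    handled = []
    while True:
        batch = poll()
        if not batch:
            return handled
        for record in batch:  # records within a partition arrive in order
            create_instance(record["processId"], record.get("variables", {}))
            commit(record)    # commit only after Zeebe accepted the instance
            handled.append(record["processId"])
```

The real version would of course loop forever and add error handling, but the ordering guarantee comes entirely from the one-at-a-time create-then-commit sequence.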