Problems with Kafka Connector



I’m on Zeebe 0.17.0, following the guide for Writing an Apache Kafka Connector for Zeebe, and I’m experiencing some problems.

bin/ config/ config/

[2019-04-27 12:23:19,084] INFO Kafka version : 2.1.0 (org.apache.kafka.common.utils.AppInfoParser:109)

[2019-04-27 12:23:19,084] INFO Kafka commitId : 809be928f1ae004e (org.apache.kafka.common.utils.AppInfoParser:110)

[2019-04-27 12:23:19,090] INFO Connecting to Zeebe broker at 'localhost:26500' (io.berndruecker.demo.kafka.connect.zeebe.ZeebeSourceTask:50)

[2019-04-27 12:23:19,091] INFO Created connector ZeebeSourceConnector (org.apache.kafka.connect.cli.ConnectStandalone:104)

[2019-04-27 12:23:19,366] INFO Subscribed to Zeebe at 'localhost:26500' for sending records (io.berndruecker.demo.kafka.connect.zeebe.ZeebeSourceTask:67)

[2019-04-27 12:23:19,366] INFO WorkerSourceTask{id=ZeebeSourceConnector-0} Source task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSourceTask:199)

[2019-04-27 12:23:19,643] INFO WorkerSourceTask{id=ZeebeSourceConnector-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:397)

[2019-04-27 12:23:19,643] INFO WorkerSourceTask{id=ZeebeSourceConnector-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:414)

[2019-04-27 12:23:19,643] ERROR WorkerSourceTask{id=ZeebeSourceConnector-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:177)


	at io.berndruecker.demo.kafka.connect.zeebe.ZeebeSourceTask.poll(
	at org.apache.kafka.connect.runtime.WorkerSourceTask.poll(
	at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(
	at org.apache.kafka.connect.runtime.WorkerTask.doRun(
	at java.util.concurrent.Executors$
	at java.util.concurrent.ThreadPoolExecutor.runWorker(
	at java.util.concurrent.ThreadPoolExecutor$


[2019-04-27 12:23:19,644] INFO Activated 1 jobs for worker KafkaConnector and job type sendMessage (io.zeebe.client.job.poller:97)

[2019-04-27 12:23:19,645] ERROR WorkerSourceTask{id=ZeebeSourceConnector-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:178)

[2019-04-27 12:23:19,649] INFO [Producer clientId=producer-1] Closing the Kafka producer with timeoutMillis = 30000 ms. (org.apache.kafka.clients.producer.KafkaProducer:1136)


Did you configure the Kafka Topic(s) to listen to?

According to your exception this is null:

This is set via “topics”, see example in
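For anyone else hitting this, a minimal standalone source-connector properties file might look roughly like this. Note this is a sketch: apart from `topics` and the connector class name (which appear in the logs above), the property names are illustrative assumptions - check the example file in the repo for the exact keys:

```properties
name=ZeebeSourceConnector
connector.class=io.berndruecker.demo.kafka.connect.zeebe.ZeebeSourceConnector
tasks.max=1

# the Kafka topic(s) the source task writes Zeebe records to - must not be empty,
# otherwise the task fails with the NullPointerException seen above
topics=zeebe-events

# Zeebe broker address (key name assumed), matching the
# "Connecting to Zeebe broker at 'localhost:26500'" log line
zeebeBrokerAddress=localhost:26500
```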


Hi Bernd,

I’m trying out your ZeebeSourceConnector. Here’s my config/ file:




Thanks for helping.


Jonas, there was a bug in reading the configuration. I just fixed it in master now - could you try again?
As I haven’t really used the Sink in my demos, that slipped through the cracks - sorry.



I built a new package and replaced the old jar, started up Kafka (kafka_2.12-2.1.0) and Zeebe (zeebe-broker-0.17.0), and then started the connector, which produced some new errors:

[2019-04-29 09:18:34,857] ERROR Failed to start task ZeebeSourceConnector-0 (org.apache.kafka.connect.runtime.Worker:455)
java.lang.NoClassDefFoundError: io/zeebe/client/api/subscription/JobHandler
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(
	at org.apache.kafka.common.config.ConfigDef.parseType(
	at org.apache.kafka.common.config.ConfigDef.parseValue(
	at org.apache.kafka.common.config.ConfigDef.parse(
	at org.apache.kafka.common.config.AbstractConfig.<init>(
	at org.apache.kafka.connect.runtime.TaskConfig.<init>(
	at org.apache.kafka.connect.runtime.Worker.startTask(
	at org.apache.kafka.connect.runtime.standalone.StandaloneHerder.createConnectorTasks(
	at org.apache.kafka.connect.runtime.standalone.StandaloneHerder.updateConnectorTasks(
	at org.apache.kafka.connect.runtime.standalone.StandaloneHerder.putConnectorConfig(
	at org.apache.kafka.connect.cli.ConnectStandalone.main(
Caused by: java.lang.ClassNotFoundException: io.zeebe.client.api.subscription.JobHandler
	at java.lang.ClassLoader.loadClass(
	at org.apache.kafka.connect.runtime.isolation.PluginClassLoader.loadClass(
	at java.lang.ClassLoader.loadClass(
	... 12 more

I could share the full log with you on Slack.

Btw: My intention was to try out the event source, not the sink. Am I wrong here?


Did you make sure that you use the “uber jar”? That’s the big one generated:

If you look into that via a ZIP tool of your choice the class should be contained within there:

Make sure this JAR is in the plugin directory of Kafka, so that it can be picked up properly.
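One quick way to verify you grabbed the right artifact is to open the jar as a zip and look for the class from the stack trace. A small sketch - the jar path is a placeholder, point it at whatever your Maven build actually produced:

```java
import java.io.File;
import java.io.IOException;
import java.util.zip.ZipFile;

public class JarCheck {

    // returns true if the given jar (a jar is just a zip) contains the
    // Zeebe client class that the ClassNotFoundException complained about
    static boolean containsJobHandler(String jarPath) throws IOException {
        try (ZipFile zip = new ZipFile(jarPath)) {
            return zip.getEntry("io/zeebe/client/api/subscription/JobHandler.class") != null;
        }
    }

    public static void main(String[] args) throws IOException {
        // placeholder path - pass the real path to your build output as an argument
        String jarPath = args.length > 0 ? args[0] : "target/kafka-connect-zeebe.jar";
        if (!new File(jarPath).exists()) {
            System.out.println("No jar at " + jarPath + " - pass the path as an argument");
            return;
        }
        System.out.println(containsJobHandler(jarPath)
            ? "JobHandler found - this looks like the uber jar"
            : "JobHandler missing - this is probably the thin jar");
    }
}
```

If the class is missing, you copied the thin jar; the shaded/uber jar bundles the Zeebe client classes alongside the connector.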


PS: I am not on Slack as I am actually officially on leave right now :wink:


You’re right, I copied the wrong JAR.

I’m now getting events on my topic:


But they don’t make any sense, do they?


Depends on your definition of “sense” :wink: I understand why you see this - it is actually the byte array of your JSON. This is set here:

I think you should see a proper JSON if you change it to:

    final SourceRecord record = new SourceRecord(null, null, topic, //
        Schema.STRING_SCHEMA, //
        job.getPayload()); // value argument assumed here: the job payload as a plain JSON String

Payload handling is actually something that might need a second thought for your environment anyway, as I am not sure what you want to send via Kafka records. JSON? Avro? …?


I just thought I had made some mistake - but if that’s the expected result, I’m good. Without reading the connector’s serialization code, I was expecting something else: that the Kafka event would be the set of variables (JSON formatted) defined in the scope of the process definition.

Now I’ll try the BPMN catch event and a Kafka connect sink.

EDIT: And yes, changing to Schema.STRING_SCHEMA worked fine.


Great! I actually changed the default to STRING_SCHEMA, as it might be less confusing to start with. Thanks for the feedback.

By the way: If you can share anything about your use case I would be interested :slight_smile: