Decode Sparkplug messages before sending to Azure Event Hub


We have Sparkplug-enabled edge devices communicating over MQTT. This works perfectly with HiveMQ.

Right now we would like to forward all MQTT messages to Azure Event Hub for further processing in Azure Databricks. Ideally, the Sparkplug messages (which are protobuf-encoded) would be decoded into plain JSON before being sent to the Event Hub.
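For context, "decoding" here just means ordinary protobuf parsing: a Sparkplug B payload is a regular protobuf message whose timestamp (field 1) and seq (field 3) are varint-encoded integers. A real deployment should use the Eclipse Tahu library or classes generated from `sparkplug_b.proto`; the minimal hand-rolled sketch below only illustrates what the varint fields look like on the wire.

```python
def read_varint(data: bytes, pos: int) -> tuple[int, int]:
    """Decode one protobuf varint starting at pos; return (value, new_pos)."""
    value, shift = 0, 0
    while True:
        byte = data[pos]
        pos += 1
        value |= (byte & 0x7F) << shift
        if not byte & 0x80:  # high bit clear: last byte of the varint
            return value, pos
        shift += 7

def decode_varint_fields(payload: bytes) -> dict[int, int]:
    """Extract varint fields (field number -> value) from a protobuf payload."""
    fields, pos = {}, 0
    while pos < len(payload):
        tag, pos = read_varint(payload, pos)
        field_number, wire_type = tag >> 3, tag & 0x07
        if wire_type != 0:  # this sketch only handles varint fields
            raise ValueError(f"unsupported wire type {wire_type}")
        fields[field_number], pos = read_varint(payload, pos)
    return fields

# A hand-crafted payload carrying timestamp=1000 (field 1) and seq=0 (field 3).
fields = decode_varint_fields(b"\x08\xe8\x07\x18\x00")
```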

So far I’ve discovered:

Unfortunately, these do not cover my use case. Are there any options in the HiveMQ ecosystem to do this? Or am I forced to write my own Sparkplug-aware client to decode and forward all messages to Azure Event Hub?

Thank you in advance for any help.

Hi @Frosty,

Nice to see your interest in MQTT, HiveMQ and Sparkplug.
The HiveMQ Kafka Extension comes with a Customization SDK, which can be leveraged for what you are trying to achieve.
I suggest you reach out to our colleagues and schedule a meeting to dive deeper into your specific requirements.
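To make the idea concrete, here is a rough, language-agnostic sketch of what such a transformer does (in Python for brevity; the actual Customization SDK is Java-based): take the MQTT topic and raw protobuf payload of each publish, decode it, and emit a JSON record for the Kafka topic that feeds Event Hub. The `decode_sparkplug` helper is a hypothetical stand-in; a real transformer would use generated protobuf classes or the Eclipse Tahu library.

```python
import json

def decode_sparkplug(payload: bytes) -> dict:
    # Hypothetical stand-in for a real Sparkplug B decoder (e.g. Eclipse
    # Tahu). Here we pretend the payload held a timestamp and a seq number.
    return {"timestamp": 1000, "seq": 0}

def transform(mqtt_topic: str, payload: bytes) -> tuple[str, bytes]:
    """Map one Sparkplug MQTT publish to one Kafka record (key, value)."""
    decoded = decode_sparkplug(payload)
    record = {"mqtt_topic": mqtt_topic, "payload": decoded}
    # Kafka record values are bytes; Event Hub / Databricks consume the JSON.
    return mqtt_topic, json.dumps(record).encode("utf-8")

key, value = transform("spBv1.0/GroupA/NDATA/Edge1", b"\x08\xe8\x07\x18\x00")
```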

Florian from the HiveMQ Team.


Hi Florian,

Thank you for the great and quick reply. Adding a custom transformer in the Kafka extension could be a good solution.

Are all extensions and customizations also available in HiveMQ Cloud?