Hi,
I’m in a situation where we have Sparkplug-enabled edge devices communicating over MQTT. This works perfectly with HiveMQ.
We would now like to forward all MQTT messages to Azure Event Hubs for further processing in Azure Databricks. Ideally the Sparkplug messages (which are Protobuf-encoded) would be decoded into plain text/JSON before being sent to the Event Hub.
So far I’ve discovered:
- How to forward MQTT messages to Azure Event Hubs using its Kafka endpoint: https://www.hivemq.com/blog/connect-hivemq-and-azure-event-hubs/?utm_source=web&utm_medium=AzureWebinar&utm_campaign=Webinar+Promotion — this would result in the still-encoded messages being published to the Event Hub.
- An extension to send Sparkplug metrics to InfluxDB: https://www.hivemq.com/extension/sparkplug-influxdb-extension/ — but we do not use InfluxDB.
So unfortunately neither of these covers my use case. Are there any options in the HiveMQ ecosystem to do this? Or do I need to write my own Sparkplug-enabled client to decode and forward all messages to Azure Event Hub?
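To frame the question a bit: if a custom client turns out to be the only route, I imagine the decode-and-forward step would look roughly like the sketch below. The `metrics_to_event_json` helper and its field names are my own assumptions, not an existing API; in a real client the metrics would come from decoding the Sparkplug B Protobuf payload (e.g. with the Eclipse Tahu generated bindings), and the resulting JSON string would then be sent to Event Hubs (e.g. via the azure-eventhub producer client).

```python
import json

# Hypothetical helper: flattens already-decoded Sparkplug metrics into a
# JSON string suitable for publishing to an Event Hub. In a real client the
# `metrics` list would be produced by decoding the Sparkplug B Protobuf
# payload (e.g. with Eclipse Tahu's generated bindings), not passed in raw.
def metrics_to_event_json(group_id, node_id, metrics):
    return json.dumps({
        "group": group_id,
        "node": node_id,
        "metrics": [
            {"name": m["name"], "timestamp": m["timestamp"], "value": m["value"]}
            for m in metrics
        ],
    })

# Example with a single decoded metric (names/values are illustrative only):
event = metrics_to_event_json(
    "plant1",
    "edge-node-1",
    [{"name": "Temperature", "timestamp": 1700000000000, "value": 21.5}],
)
```

The missing piece is of course the part in the comment: subscribing to the `spBv1.0/#` topics and doing the actual Protobuf decode, which is exactly what I was hoping an existing HiveMQ extension could take care of.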
Thank you in advance for any help.