Serialization of Kafka metrics failed

Hi All,

I am new to HiveMQ, and my team is using HiveMQ with the MQTT protocol for a project. We are using the hivemq-kafka extension to get data from HiveMQ to Kafka. Currently I am facing an issue with the serialization of Kafka metrics in the broker logs: the message (record) being sent is larger than the allowed size. From the log, the maximum allowed size is 5 MB, but the record is 9 MB.

Can anyone explain in detail how to increase this maximum message size, and where that limit has to be configured, so that the issue I am facing gets resolved?

I have attached a screenshot of the issue here.

Hi @Nirmal ,
Nice to see you interested in HiveMQ and the MQTT sphere. Welcome to our community!

What HiveMQ version are you using? You can find the exact version by checking your hivemq.log file.

Kind regards,
Dasha from HiveMQ team

Hi @Daria_H ,

Currently I am using HiveMQ version 4.5.

Regards,
Nirmal

Thanks @Nirmal, you need to upgrade your HiveMQ version. Please refer to the same topic here: Error in log - Serialization of Kafka metrics?

Thanks @Daria_H for the above update. May I know when HiveMQ version 4.5 will reach end of life?

Regards,
Nirmal

Sure, @Nirmal, it is HiveMQ 4.5 (end of life: 12-01-2023).

@Daria_H, the main doubt I have with this issue is that I am not sure where this maximum allowed bytes value is configured. The exception states that 5242880 bytes is the maximum allowed, but I am not able to find where that limit is defined. I also checked the MAX_PACKET_SIZE value, which is set to 256 MB. Can you provide details on where the maximum message size mentioned in the log above is configured?
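For reference, these are the standard Apache Kafka settings that I understand govern this limit. The 5242880 bytes in the exception matches what the Kafka producer calls max.request.size (the plain-Kafka default would be 1048576, i.e. 1 MB, so something in our setup appears to have already raised it to 5 MB). The values below are only an illustrative sketch raising everything to 10 MB; I am not sure where, or whether, our version of the HiveMQ Kafka extension exposes the producer-side property:

```properties
# Producer (Kafka client) side: largest request the producer will send.
# 5242880 = 5 MB, the limit reported in the exception; 10485760 = 10 MB.
max.request.size=10485760

# Broker side (server.properties): largest record batch the broker accepts.
message.max.bytes=10485760

# Topic level (per-topic override of the broker default, set via kafka-configs.sh):
max.message.bytes=10485760

# Consumer side: commonly raised to match, so large records can be fetched.
max.partition.fetch.bytes=10485760
```

As far as I understand, these limits are usually raised together, since the smallest producer/broker limit still acts as the effective cap.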

Also, with respect to the end-of-life date, do you mean the 1st of December 2023?