MQTT Explorer uses Protocol Buffers (protobuf) for efficient binary serialization of internal IPC messages. This provides significant performance improvements over JSON serialization for high-throughput message passing between the renderer and main process.

Why Protocol Buffers?

Protocol Buffers offer several advantages for IPC:

Performance

Binary encoding is faster to serialize/deserialize than JSON

Size

Encoded messages are smaller, reducing memory overhead

Type Safety

Schema validation ensures message structure integrity

Backwards Compatibility

Field numbers allow schema evolution without breaking changes

Message Codec Implementation

The MessageCodec class provides binary encoding/decoding for MQTT messages:
import * as protobuf from 'protobufjs'

// Define message schema
const messageSchema = {
  nested: {
    mqtt: {
      nested: {
        Envelope: {
          fields: {
            topic: { type: 'string', id: 1 },
            payload: { type: 'bytes', id: 2 },
          },
        },
      },
    },
  },
}

// Create root from JSON schema
const root = protobuf.Root.fromJSON(messageSchema)
const Envelope = root.lookupType('mqtt.Envelope')

Schema Structure

The protobuf schema defines an Envelope message with two fields:
Field   | Type   | ID | Description
------- | ------ | -- | -------------------
topic   | string | 1  | MQTT topic string
payload | bytes  | 2  | Binary payload data
Field IDs (1, 2) are permanent identifiers; to preserve backwards compatibility, never change or reuse them.

MessageCodec API

The MessageCodec class provides three static methods:

encode(topic, data)

Serializes a message to binary format:
public static encode(topic: string, data: any): Uint8Array {
  // Serialize the payload to JSON, then to bytes
  const jsonString = JSON.stringify(data)
  const payloadBytes = new TextEncoder().encode(jsonString)

  // Create protobuf envelope
  const message = Envelope.create({
    topic,
    payload: payloadBytes,
  })

  // Encode to binary
  return Envelope.encode(message).finish()
}
Example:
const binary = MessageCodec.encode('home/temperature', { value: 22.5 })
// Returns: Uint8Array [10, 16, 104, 111, 109, 101, ...]
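
The leading bytes follow directly from the proto3 wire format: field 1 uses tag (1 << 3) | 2 = 0x0A, followed by a length-prefixed UTF-8 string. As a sanity check, the same envelope can be built by hand with only standard APIs (a sketch assuming both lengths fit in a single varint byte):

```typescript
// Hand-rolled Envelope writer mirroring the schema above:
// field 1 (topic): tag 0x0A, then length-prefixed UTF-8 bytes
// field 2 (payload): tag 0x12, then length-prefixed raw bytes
function encodeEnvelopeByHand(topic: string, payload: Uint8Array): Uint8Array {
  const topicBytes = new TextEncoder().encode(topic)
  // Assumption: both lengths are < 128, so each fits in one varint byte.
  return new Uint8Array([
    0x0a, topicBytes.length, ...topicBytes,
    0x12, payload.length, ...payload,
  ])
}

const payload = new TextEncoder().encode(JSON.stringify({ value: 22.5 }))
const bytes = encodeEnvelopeByHand('home/temperature', payload)
console.log(Array.from(bytes.slice(0, 6)))
// [10, 16, 104, 111, 109, 101]: tag, length 16, then "home"...
```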

decode(binary)

Decodes a binary message:
public static decode(binary: Uint8Array): BinaryMessage {
  const message = Envelope.decode(binary) as any
  return {
    topic: message.topic,
    payload: message.payload,
  }
}
Example:
const { topic, payload } = MessageCodec.decode(binary)
// topic: 'home/temperature'
// payload: Uint8Array [123, 34, 118, ...]

decodeWithPayload<T>(binary)

Decodes and parses the payload as JSON:
public static decodeWithPayload<T>(binary: Uint8Array): { 
  topic: string
  data: T 
} {
  const { topic, payload } = this.decode(binary)
  const jsonString = new TextDecoder().decode(payload)
  const data = JSON.parse(jsonString)
  return { topic, data }
}
Example:
interface TempData {
  value: number
}

const { topic, data } = MessageCodec.decodeWithPayload<TempData>(binary)
// topic: 'home/temperature'
// data: { value: 22.5 }
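
decodeWithPayload casts the parsed JSON to T without any runtime check, so a type guard is a useful companion (a sketch; isTempData is illustrative and not part of the codec):

```typescript
interface TempData {
  value: number
}

// Narrow unknown JSON to TempData only when the shape actually matches.
function isTempData(data: unknown): data is TempData {
  return (
    typeof data === 'object' &&
    data !== null &&
    typeof (data as { value?: unknown }).value === 'number'
  )
}

const parsed: unknown = JSON.parse('{"value":22.5}')
if (isTempData(parsed)) {
  console.log(parsed.value) // safely typed as number
}
```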

Usage Pattern

The MessageCodec is used for IPC between Electron processes:
1. Renderer encodes message

const binary = MessageCodec.encode('sensors/motion', {
  detected: true,
  timestamp: Date.now()
})
2. Send via IPC channel

ipcRenderer.send('mqtt-message', binary)
3. Main process decodes

ipcMain.on('mqtt-message', (event, binary) => {
  const { topic, data } = MessageCodec.decodeWithPayload(binary)
  // Process the decoded message
})
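
Once decoded, messages typically fan out to handlers by topic. A minimal dispatcher sketch (TopicRouter is illustrative, not part of MQTT Explorer):

```typescript
type Handler = (data: unknown) => void

// Routes a decoded message to the first handler whose prefix matches its topic.
class TopicRouter {
  private handlers = new Map<string, Handler>()

  on(prefix: string, handler: Handler): void {
    this.handlers.set(prefix, handler)
  }

  dispatch(topic: string, data: unknown): boolean {
    for (const [prefix, handler] of this.handlers) {
      if (topic.startsWith(prefix)) {
        handler(data)
        return true
      }
    }
    return false
  }
}
```

Inside the ipcMain handler above, a call like router.dispatch(topic, data) would replace the inline processing.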

Performance Characteristics

Encoding Benchmark

The loop below times only the JSON-serialization half of encode() (stringify plus UTF-8 encoding); the protobuf envelope encode adds a small additional cost per message:
const start = performance.now()
for (let i = 0; i < 10000; i++) {
  const json = JSON.stringify({ topic, payload: data })
  const bytes = new TextEncoder().encode(json)
}
const duration = performance.now() - start
// ~150ms for 10,000 iterations

Size Comparison

For a typical MQTT message:
{
  "topic": "home/livingroom/temperature",
  "payload": {"value": 22.5, "unit": "celsius"}
}
Format   | Size     | Reduction
-------- | -------- | -----------
JSON     | 89 bytes | -
Protobuf | 61 bytes | 31% smaller
Size savings increase with larger payloads and more structured data.

Type Definitions

The module exports TypeScript interfaces:
export interface BinaryMessage {
  topic: string
  payload: Uint8Array
}

export class MessageCodec {
  public static encode(topic: string, data: any): Uint8Array
  public static decode(binary: Uint8Array): BinaryMessage
  public static decodeWithPayload<T>(binary: Uint8Array): { 
    topic: string
    data: T 
  }
}

Protobuf vs Sparkplug

MQTT Explorer uses Protocol Buffers in two different contexts:
Context      | Purpose                | Schema            | Library
------------ | ---------------------- | ----------------- | -----------------
IPC Messages | Internal communication | Custom (Envelope) | protobufjs
Sparkplug B  | MQTT payload decoding  | Sparkplug spec    | sparkplug-payload
Different Use Cases
  • IPC protobuf is for internal performance optimization
  • Sparkplug protobuf is for industrial IoT protocol support
These are separate systems and not interchangeable.

Advanced: Schema Evolution

Protocol Buffers support backwards-compatible schema changes:

Safe Changes

Add Optional Fields

New fields with higher IDs are ignored by old decoders
fields: {
  topic: { type: 'string', id: 1 },
  payload: { type: 'bytes', id: 2 },
  qos: { type: 'int32', id: 3 }  // New!
}

Add New Message Types

Extend the schema without affecting existing messages
nested: {
  mqtt: {
    nested: {
      Envelope: { /* ... */ },
      Statistics: { /* ... */ }  // New!
    }
  }
}

Unsafe Changes

Never do these:
  • Change field IDs (breaks decoding)
  • Change field types (corrupts data)
  • Remove fields that deployed readers still use (decoders silently see default values)
  • Reuse field IDs (causes confusion)
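
The "ignored by old decoders" claim can be demonstrated with a hand-rolled reader that only knows fields 1 and 2: bytes for an unknown field 3 (say, a newly added qos) are skipped instead of breaking the decode. A sketch using only standard APIs, assuming single-byte tags and lengths:

```typescript
// A "v1" reader that predates the qos field. Unknown fields are skipped,
// which is exactly why adding fields with new IDs is backwards compatible.
function decodeEnvelopeV1(bytes: Uint8Array): { topic: string; payload: Uint8Array } {
  let i = 0
  let topic = ''
  let payload = new Uint8Array(0)
  while (i < bytes.length) {
    const tag = bytes[i++] // assumption: single-byte tags
    const field = tag >> 3
    const wire = tag & 7
    if (wire === 2) { // length-delimited (string, bytes)
      const len = bytes[i++] // assumption: single-byte lengths
      const chunk = bytes.slice(i, i + len)
      i += len
      if (field === 1) topic = new TextDecoder().decode(chunk)
      else if (field === 2) payload = chunk
      // other length-delimited fields are silently ignored
    } else if (wire === 0) { // varint (e.g. an int32 qos)
      while (bytes[i++] & 0x80) {} // skip continuation bytes
    } else {
      throw new Error(`unsupported wire type ${wire}`)
    }
  }
  return { topic, payload }
}

// Envelope with topic "a", payload [1], plus unknown field 3 (qos = 1, tag 0x18):
const withQos = new Uint8Array([0x0a, 1, 97, 0x12, 1, 1, 0x18, 1])
const decoded = decodeEnvelopeV1(withQos)
console.log(decoded.topic) // "a": the unknown qos bytes were skipped
```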

Error Handling

The MessageCodec does not catch errors itself; both protobuf decoding and JSON parsing can throw:
try {
  const { topic, data } = MessageCodec.decodeWithPayload(binary)
  console.log('Decoded:', topic, data)
} catch (error) {
  if (error.message.includes('invalid wire type')) {
    console.error('Corrupted protobuf data')
  } else if (error.message.includes('Unexpected token')) {
    console.error('Invalid JSON in payload')
  } else {
    console.error('Decode error:', error)
  }
}

Common Errors

The binary data is not valid Protocol Buffers:
  • Check that you’re decoding data encoded with the same schema
  • Verify the binary wasn’t corrupted during transmission
  • Ensure you’re not trying to decode JSON as protobuf
The payload contains invalid JSON:
  • The encoder expects JSON-serializable data
  • Check for circular references in the data object
  • Verify the data isn’t undefined or a function
The protobuf schema wasn’t loaded correctly:
  • Ensure protobufjs is installed: npm install protobufjs
  • Verify the schema definition is valid JSON
  • Check that Root.fromJSON() succeeded
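
As a concrete case of the invalid-payload class, a circular reference makes JSON.stringify throw inside encode() before any protobuf work happens (illustrative):

```typescript
// JSON.stringify rejects circular structures, so MessageCodec.encode()
// would throw at the JSON-serialization step, never reaching protobuf.
const data: { self?: unknown } = {}
data.self = data

let failed = false
try {
  JSON.stringify(data)
} catch {
  failed = true
}
console.log(failed) // true
```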

Dependencies

The MessageCodec requires:
{
  "dependencies": {
    "protobufjs": "^7.0.0"
  }
}
protobufjs is a pure JavaScript implementation of Protocol Buffers with no native dependencies.

Using with Custom Decoders

You can use protobuf schemas in custom MQTT message decoders:
1. Define your schema

const sensorSchema = {
  nested: {
    sensors: {
      nested: {
        Reading: {
          fields: {
            sensor_id: { type: 'string', id: 1 },
            value: { type: 'double', id: 2 },
            timestamp: { type: 'int64', id: 3 }
          }
        }
      }
    }
  }
}

const root = protobuf.Root.fromJSON(sensorSchema)
const Reading = root.lookupType('sensors.Reading')
2. Create decoder

export const ProtobufSensorDecoder: MessageDecoder = {
  formats: ['sensor-protobuf'],
  
  canDecodeTopic(topic: string) {
    return topic.startsWith('sensors/')
  },
  
  decode(input: Base64Message) {
    try {
      const buffer = input.toBuffer()
      const decoded = Reading.decode(buffer)
      const json = Reading.toObject(decoded)
      
      return {
        message: Base64Message.fromString(JSON.stringify(json, null, 2)),
        decoder: Decoder.NONE
      }
    } catch (error) {
      return {
        error: `Protobuf decode failed: ${error.message}`,
        decoder: Decoder.NONE
      }
    }
  }
}
3. Register and use

Add to the decoders array and it will automatically decode matching topics.

Loading .proto Files

For complex schemas, use .proto files instead of JSON:
import * as protobuf from 'protobufjs'
import * as path from 'path'

// Load from .proto file
const root = await protobuf.load(path.join(__dirname, 'schema.proto'))
const MessageType = root.lookupType('package.MessageType')

// Use the same way as JSON schemas
const binary = MessageType.encode({ field: 'value' }).finish()
Example .proto file:
syntax = "proto3";

package mqtt;

message Envelope {
  string topic = 1;
  bytes payload = 2;
  int32 qos = 3;
  bool retained = 4;
}
MQTT Explorer uses JSON schemas for simplicity, but .proto files offer better tooling support.

Best Practices

Cache Compiled Types

Compile protobuf types once at startup, not per message

Validate Before Encoding

Use Type.verify() to catch invalid data before encoding

Handle Partial Data

Protobuf decodes partial messages - validate completeness

Document Field Numbers

Comment what each field ID represents for maintainability

Use Type Guards

Add TypeScript type guards for decoded message validation

Profile Performance

Measure encoding/decoding time for your specific use case

See Also