We are currently using a Kafka source which reads protobuf like so:

```yaml
kafka_metrics:
  type: "kafka"
  bootstrap_servers: "bootstrap.strimzi.svc.cluster.local:9093"
  group_id: "metrics.prod"
  topics:
    - "^edge.*metrics.*"
  sasl:
    enabled: true
    mechanism: SCRAM-SHA-512
    username: ${SRC_KAFKA_SASL_USERNAME}
    password: ${SRC_KAFKA_SASL_PASSWORD}
  decoding:
    codec: protobuf
    protobuf:
      desc_file: /opt/proto-store/streamingevent.desc
      message_type: edge.StreamingProtoEventV1Value
```

We now also want to encode the key and header in protobuf before sending to Kafka. Will Vector be able to read this? How can I include this?
Hi @seilerre,
Vector will be able to decode everything based on the descriptor file and message type you specify. So if you change your protos to include the key and header, those will become part of the decoded Vector event. Let me know if I am missing something.
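For illustration, here is a minimal sketch of the approach described above: carrying the key and header inside the protobuf value that Vector already decodes. The field names, field numbers, and types below are assumptions for this example; the real `edge.StreamingProtoEventV1Value` schema is not shown in this thread.

```proto
syntax = "proto3";

package edge;

message StreamingProtoEventV1Value {
  // ...existing payload fields go here...

  // Hypothetical additions so the Kafka key and headers travel in the value
  // and are decoded by Vector's protobuf codec along with the rest of the payload.
  string kafka_key = 100;                  // assumed field name/number
  map<string, string> kafka_headers = 101; // assumed field name/number
}
```

After changing the schema, the descriptor file referenced by `desc_file` would need to be regenerated, e.g. with `protoc --include_imports --descriptor_set_out=/opt/proto-store/streamingevent.desc streamingevent.proto`; the new fields then appear as ordinary fields on the decoded Vector event.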
This worked great by doing:
and then