I have the configuration below, but I initially pointed it at the wrong Kafka service. Vector still advanced the checkpoint, so the entries currently in log.log were marked as read by the file source. Now that I have configured the correct Kafka server, those earlier entries will not be sent because they have already been checkpointed. I want log entries to be checkpointed only after they are successfully delivered to Kafka.

I think the solution involves disk buffering, so I need to define a disk buffer. How do I specify the path for the buffers in the configuration?
customConfig:
  api:
    enabled: true
    address: 127.0.0.1:8686
    playground: false
  sources:
    file_logs:
      type: "file"
      include:
        - "/vector-data-dir/log.log"
      read_from: "beginning"
  data_dir: /vector-data-dir/
  transforms:
    app_logs_parser:
      inputs:
        - "file_logs"
      type: "remap"
      source: |
        # Try to parse the JSON message
        . = parse_json(.message) ?? null
        # Check if parsing was successful
        if is_null(.) {
          abort # unknown type
        }
        # Handle two formats of logs: flat and nested
        log_value = if exists(.log) { .log } else { . }
        # Merge log_value into the main object and add timestamp
        . = log_value
  sinks:
    kafka_server:
      type: "kafka"
      inputs:
        - "app_logs_parser"
      encoding:
        codec: "json"
      bootstrap_servers: kafka.vector.svc.cluster.local:9092
      topic: test