I'm writing a transformation. As an example, I have the following so far:

```toml
[transforms.extract-entity-count]
inputs = ["parse_graph_node_logs"]
type = "remap"
source = """
final_fields = {}
final_fields.subgraph_id = .subgraph_id
final_fields.unix_timestamp = .unix_timestamp
final_fields.cluster_name = .cluster_name
# Extract the entity count from the message
regexed, err = parse_regex(.message, r'^.*entities: (?P<entity_count>\\d+),')
if err == null {
  final_fields.entity_count = parse_int!(regexed.entity_count)
} else {
  # If we didn't extract an entity count from this message, set the event to empty; there's nothing to write to Postgres
  final_fields = {}
  . = {}
}
. |= final_fields
"""

[sinks.stdout-entity-count]
inputs = ["extract-entity-count"]
type = "console"
encoding.codec = "json"
target = "stdout"
```

When I run this, I see many log lines from Vector which simply log empty objects (`{}`).
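If I'm reading the remap source right, the mismatch branch (`. = {}`) is what produces those empty events. A minimal Python sketch of the same logic (standing in for Vector/VRL, with hypothetical sample messages and the field names from the config above):

```python
import re

# Mirrors the VRL pattern r'^.*entities: (?P<entity_count>\d+),'
PATTERN = re.compile(r'^.*entities: (?P<entity_count>\d+),')

def extract(event):
    """Mimic the remap source: keep selected fields and add entity_count
    when the message matches, otherwise return an empty event."""
    m = PATTERN.match(event.get("message", ""))
    if m is None:
        return {}  # non-matching messages become empty events
    return {
        "subgraph_id": event.get("subgraph_id"),
        "unix_timestamp": event.get("unix_timestamp"),
        "cluster_name": event.get("cluster_name"),
        "entity_count": int(m.group("entity_count")),
    }

print(extract({"subgraph_id": "a", "unix_timestamp": 1, "cluster_name": "prod",
               "message": "applied entities: 42, elapsed 3ms"}))
print(extract({"message": "no counts here"}))  # → {}
```

Every event still flows to the sink either way, which is why the console fills with `{}` lines.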
Replies: 1 comment
Oh, I think I found the answer. I think using `abort` and `drop_on_abort = true` is the key, like so:

```toml
[transforms.extract-entity-count]
inputs = ["parse_graph_node_logs"]
type = "remap"
drop_on_abort = true
source = """
final_fields = {}
final_fields.subgraph_id = .subgraph_id
final_fields.unix_timestamp = .unix_timestamp
final_fields.cluster_name = .cluster_name
# Extract the entity count from the message
regexed, err = parse_regex(.message, r'^.*entities: (?P<entity_count>\\d+),')
if err == null {
  final_fields.entity_count = parse_int!(regexed.entity_count)
  . |= final_fields
} else {
  # If we didn't extract an entity count from this message, abort this pipeline (note `drop_on_abort = true` for this transform)
  abort
}
"""
```
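To illustrate the difference this makes, here's a hypothetical Python stand-in for the updated transform (not Vector itself; sample events are made up): with `drop_on_abort = true`, an aborted event is removed from the stream rather than being forwarded as an empty object.

```python
import re

# Mirrors the VRL pattern r'^.*entities: (?P<entity_count>\d+),'
PATTERN = re.compile(r'^.*entities: (?P<entity_count>\d+),')

def transform(events):
    """Mimic the remap source with drop_on_abort = true:
    non-matching events are dropped instead of passed through."""
    out = []
    for event in events:
        m = PATTERN.match(event.get("message", ""))
        if m is None:
            continue  # `abort` + drop_on_abort: the event vanishes here
        out.append({
            "subgraph_id": event.get("subgraph_id"),
            "unix_timestamp": event.get("unix_timestamp"),
            "cluster_name": event.get("cluster_name"),
            "entity_count": int(m.group("entity_count")),
        })
    return out

events = [
    {"subgraph_id": "a", "unix_timestamp": 1, "cluster_name": "prod",
     "message": "applied entities: 42, elapsed 3ms"},
    {"message": "no counts here"},
]
print(transform(events))  # only the matching event reaches the sink
```

So the console sink only ever sees events that actually have an `entity_count`, and the stream of empty `{}` lines goes away.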