**Question**

I have a simple Node HTTP server that receives log messages and saves them to a Redis server, which is then read by Logstash and fed into Elasticsearch for visualization with Kibana. I am trying to migrate to SigNoz. I have configured my Node server to send the logs to Vector, and I have added an `opentelemetry` sink pointing at the SigNoz collector (full config below). The data that is received by Vector is in the following format:

```json
{
"@timestamp": "2025-09-01T03:02:37.000Z",
"@version": 1,
"client_ip": "4.4.4.4",
"domain": "example.com",
"http_method": "GET",
"http_user_agent": "Mozilla/5.0 (Linux; Android 9; RMX1945) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/71.0.3578.99 Mobile Safari/537.36",
"log_level": "Error",
"log_transport": "beacon",
"log_type": "client_log",
"message": "Script error. 0",
"path": "/",
"request_uri": "https://www.example.com/test.html",
"source_type": "http_server",
"timestamp": "2025-09-01T03:02:23.978235612Z",
"type": "log"
}
```

The data that is POSTed to the OTLP endpoint is:

```json
{
"attributes.http.request.method":"GET",
"attributes.http.user_agent":"Mozilla/5.0 (Linux; Android 9; RMX1945) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/71.0.3578.99 Mobile Safari/537.36",
"attributes.log.transport":"beacon",
"body":"Script error. 0",
"resource.client.ip":"4.4.4.4",
"resource.domain":"example.com",
"resource.log.type":"client_log",
"resource.service.name":"example.com",
"resource.url.full":"https://www.example.com/",
"severityNumber":17,
"severityText":"ERROR",
"source_type":"http_server",
"timestamp":1756729880.552024
}
```

Isn't the sink supposed to format it into the OTLP format?
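For comparison, an OTLP/JSON document for the sample record above would look roughly like this. This is a hand-built sketch: the nested `resourceLogs`/`scopeLogs`/`logRecords` shape is what `/v1/logs` expects, the attribute keys are taken from the flat event above, and the `timeUnixNano` value is just the posted float rescaled to nanoseconds:

```json
{
  "resourceLogs": [
    {
      "resource": {
        "attributes": [
          { "key": "service.name", "value": { "stringValue": "example.com" } }
        ]
      },
      "scopeLogs": [
        {
          "logRecords": [
            {
              "timeUnixNano": "1756729880552024000",
              "severityNumber": 17,
              "severityText": "ERROR",
              "body": { "stringValue": "Script error. 0" },
              "attributes": [
                { "key": "http.request.method", "value": { "stringValue": "GET" } },
                { "key": "log.transport", "value": { "stringValue": "beacon" } }
              ]
            }
          ]
        }
      ]
    }
  ]
}
```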
**Vector Config**

```yaml
api:
  enabled: true
  address: 0.0.0.0:8686

sources:
  client_logs:
    type: http_server
    address: 0.0.0.0:8687
    decoding:
      codec: json
    framing:
      method: newline_delimited

sinks:
  signoz_otlp:
    type: opentelemetry
    inputs:
      - otel_fields
    protocol:
      type: http
      uri: http://signoz-otel-collector:4318/v1/logs
      method: post
      encoding:
        codec: json
      framing:
        method: newline_delimited
      batch:
        max_events: 1
      request:
        headers:
          content-type: application/json

  console:
    inputs:
      - otel_fields
    target: stdout
    type: console
    encoding:
      codec: json

transforms:
  normalize_level:
    type: remap
    inputs: [client_logs]
    source: |
      lvl = to_string(.log_level) ?? "INFO"
      lower = downcase(lvl)
      .severityText = "INFO"
      .severityNumber = 9
      if lower == "trace" || lower == "trc" {
        .severityText = "TRACE"
        .severityNumber = 1
      } else if lower == "debug" || lower == "dbg" {
        .severityText = "DEBUG"
        .severityNumber = 5
      } else if lower == "info" || lower == "information" || lower == "log" {
        .severityText = "INFO"
        .severityNumber = 9
      } else if lower == "warn" || lower == "warning" {
        .severityText = "WARN"
        .severityNumber = 13
      } else if lower == "error" || lower == "err" || lower == "exception" || lower == "exceptions" {
        .severityText = "ERROR"
        .severityNumber = 17
      } else if lower == "fatal" || lower == "critical" || lower == "crit" {
        .severityText = "FATAL"
        .severityNumber = 21
      }

  otel_fields:
    type: remap
    inputs: [normalize_level]
    source: |
      .timestamp = to_float(parse_timestamp!(.timestamp, "%Y-%m-%dT%H:%M:%S.%9fZ"))

      # ----- Resource attributes (prefix resource.) -----
      service_name = to_string(.domain) ?? "client"
      ."resource.service.name" = service_name
      if exists(.log_type) {
        ."resource.log.type" = to_string(.log_type) ?? "client_log"
      }
      if exists(.client_ip) {
        ."resource.client.ip" = to_string(.client_ip) ?? ""
      }
      if exists(.request_uri) {
        ."resource.url.full" = to_string(.request_uri) ?? ""
      }
      if exists(.domain) {
        ."resource.domain" = to_string(.domain) ?? ""
      }

      # ----- Log attributes (prefix attributes.) -----
      if exists(.http_user_agent) {
        ."attributes.http.user_agent" = to_string(.http_user_agent) ?? ""
      }
      if exists(.http_method) {
        ."attributes.http.request.method" = to_string(.http_method) ?? ""
      }
      if exists(.log_transport) {
        ."attributes.log.transport" = to_string(.log_transport) ?? ""
      }

      # Hoist any context_* fields into attributes.context.*
      ctx_pairs = []
      for_each(object(.)) -> |k, v| {
        if starts_with(k, "context_") {
          ctx_pairs = push(ctx_pairs, {"key": replace(k, "context_", ""), "value": to_string(v) ?? encode_json(v)})
        }
      }
      if length(ctx_pairs) > 0 {
        .attributes."context.kv" = encode_json(ctx_pairs)
      }

      # Body as string
      .body = to_string(.message) ?? encode_json(.message)

      del(."@timestamp")
      del(."@version")
      del(.client_ip)
      del(.domain)
      del(.http_method)
      del(.http_user_agent)
      del(.request_uri)
      del(.path)
      del(.message)
      del(.log_level)
      del(.log_transport)
      del(.log_type)
      del(.type)
```
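To check what the collector accepts independently of Vector, one option is to save a document like the OTLP/JSON sketch above as `payload.json` (a hypothetical file name) and POST it straight to the endpoint from the sink config:

```bash
# Smoke test: send one OTLP/JSON document directly to the SigNoz
# collector, bypassing Vector, to confirm the accepted payload shape.
curl -sS -X POST http://signoz-otel-collector:4318/v1/logs \
  -H 'Content-Type: application/json' \
  --data-binary @payload.json
```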
**Vector Logs**

_No response_
---

Hi @JoyceBabu,

That is correct. By default, data is mapped to a Vector format.

The bad news: the sink doesn't provide such encoding capabilities (for now, see #22696).

The good news: I was working on this recently. This should be possible with the next release, or if you are eager to try it, you can use the next nightly that will include #23524.

If I understand correctly, your logs are already in OTLP format. Then you can use an `opentelemetry` source, enable OTLP decoding, and finally, in the sink, use protobu…

(In the future, we might be able to skip the last step as well. We are actively working on improving this area.)
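The reply is cut off above, but the source half of that suggestion can be sketched. A minimal `opentelemetry` source, assuming the conventional 4317/4318 listener addresses and a hypothetical component name `otlp_in`; the sink-side protobuf encoding the reply goes on to mention is the part added by #23524, so those options are not shown here:

```yaml
sources:
  otlp_in:
    # Accept OTLP over both gRPC and HTTP; the source decodes
    # incoming OTLP payloads into Vector events.
    type: opentelemetry
    grpc:
      address: 0.0.0.0:4317
    http:
      address: 0.0.0.0:4318
```

With a source like this in place, the Node server could send OTLP directly to Vector instead of the custom `http_server` source, and the flat-field remap transforms above would presumably no longer be needed.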