
Network issues in js lambda layer #1120

@kaskavalci

Description


name: Bug report
about: network issue in ADOT layer
title: ''
labels: bug
assignees: ''


Describe the bug
I'm trying to export logs to Grafana via the otlphttp exporter. I've noticed the collector cannot make requests to the specified endpoint.

Steps to reproduce

  1. Use "arn:aws:lambda:${var.aws_region}:901920570463:layer:aws-otel-nodejs-arm64-ver-1-30-2:1" layer
  2. Use otlphttp exporter
exporters:
  otlphttp:
    endpoint: "https://otlp-gateway-prod-eu-west-2.grafana.net/otlp"
    headers:
      authorization: "Basic xxx"
  1. Trigger the lambda
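For context, a minimal sketch of how the full collector configuration around that exporter might look, assuming an OTLP receiver and a batch processor feeding the metrics pipeline; the receiver and processor sections here are illustrative assumptions, only the otlphttp block matches the config above:

# Illustrative sketch only: receiver and processor sections are assumptions.
receivers:
  otlp:
    protocols:
      http:
        endpoint: "localhost:4318"   # assumed local OTLP/HTTP receiver endpoint

processors:
  batch:

exporters:
  otlphttp:
    endpoint: "https://otlp-gateway-prod-eu-west-2.grafana.net/otlp"
    headers:
      authorization: "Basic xxx"

service:
  pipelines:
    metrics:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlphttp]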

What did you expect to see?
The exporter should succeed in making the network request.

What did you see instead?

{"level":"info","ts":1756069616.3446298,"caller":"internal/retry_sender.go:126","msg":"Exporting failed. Will retry the request after interval.","kind":"exporter","data_type":"metrics","name":"otlphttp","error":"failed to make an HTTP request: Post \"https://otlp-gateway-prod-eu-west-2.g
rafana.net/otlp/v1/metrics\": context canceled","interval":"8.871688847s"}
{"level":"error","ts":1756069616.3443046,"caller":"internal/base_exporter.go:139","msg":"Exporting fa
iled. Rejecting data. Try enabling sending_queue to survive temporary failures.","kind":"exporter","data_type":"metrics","name":"otlphttp","error":"request is cancelled or timed out failed to make an HTTP request: Post \"https://otlp-gateway-prod-eu-west-2.grafana.net/otlp/v1/metrics\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while
 awaiting headers)","rejected_items":4,"stacktrace":"go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*BaseExporter).Send\n\tgo.opentelemetry.io/collector/[email protected]/exporterhelper/internal/base_exporter.go:139\ngo.opentelemetry.io/collector/exporter/exporterhelper.NewMetricsRequest.func1\n\tgo.opentelemetry.io/collector/[email protected]/exporterhel
per/metrics.go:135\ngo.opentelemetry.io/collector/consumer.ConsumeMetricsFunc.ConsumeMetrics\n\tgo.opentelemetry.io/collector/[email protected]/metrics.go:26\ngo.opentelemetry.io/collector/internal/fanoutconsumer.(*metricsConsumer).ConsumeMetrics\n\tgo.opentelemetry.io/collector/internal/[email protected]/metrics.go:71\ngo.opentelemetry.io/collector/consumer.ConsumeM
etricsFunc.ConsumeMetrics\n\tgo.opentelemetry.io/collector/[email protected]/metrics.go:26\ngo.opentelemetry.io/collector/receiver/otlpreceiver/internal/metrics.(*Receiver).Export\n\tgo.opentelemetry.io/collector/receiver/[email protected]/internal/metrics/otlp.go:41\ngo.opentelemetry.io/collector/receiver/otlpreceiver.handleMetrics\n\tgo.opentelemetry.io/collector/rec
eiver/[email protected]/otlphttp.go:76\ngo.opentelemetry.io/collector/receiver/otlpreceiver.(*otlpReceiver).startHTTPServer.func2\n\tgo.opentelemetry.io/collector/receiver/otlpreceive
[email protected]/otlp.go:146\nnet/http.HandlerFunc.ServeHTTP\n\tnet/http/server.go:2220\nnet/http.(*ServeMux).ServeHTTP\n\tnet/http/server.go:2747\ngo.opentelemetry.io/collector/config/confighttp.(*decompressor).ServeHTTP\n\tgo.opentelemetry.io/collector/config/[email protected]/compression.go:175\ngo.opentelemetry.io/collector/config/confighttp.(*ServerConfig).ToServer.maxRe
questBodySizeInterceptor.func2\n\tgo.opentelemetry.io/collector/config/[email protected]/confighttp.go:555\nnet/http.HandlerFunc.ServeHTTP\n\tnet/http/server.go:2220\ngo.opentelemetry.i
o/contrib/instrumentation/net/http/otelhttp.(*middleware).serveHTTP\n\tgo.opentelemetry.io/contrib/instrumentation/net/http/[email protected]/handler.go:171\ngo.opentelemetry.io/contrib/in
strumentation/net/http/otelhttp.NewMiddleware.func1.1\n\tgo.opentelemetry.io/contrib/instrumentation/net/http/[email protected]/handler.go:65\nnet/http.HandlerFunc.ServeHTTP\n\tnet/http/se
rver.go:2220\ngo.opentelemetry.io/collector/config/confighttp.(*clientInfoHandler).ServeHTTP\n\tgo.opentelemetry.io/collector/config/[email protected]/clientinfohandler.go:26\nnet/http.
serverHandler.ServeHTTP\n\tnet/http/server.go:3210\nnet/http.(*conn).serve\n\tnet/http/server.go:2092"}
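The second error is a client-side timeout ("Client.Timeout exceeded while awaiting headers"). For reference, a hedged sketch of raising the otlphttp exporter's HTTP timeout; the 30s value is purely illustrative, and I have not confirmed what the layer's default is:

exporters:
  otlphttp:
    endpoint: "https://otlp-gateway-prod-eu-west-2.grafana.net/otlp"
    headers:
      authorization: "Basic xxx"
    # timeout controls how long the HTTP client waits for a response;
    # 30s is an illustrative value, not a recommendation.
    timeout: 30s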

What version of collector/language SDK did you use?

  1. "@opentelemetry/api": "^1.9.0",
  2. "arn:aws:lambda:${var.aws_region}:901920570463:layer:aws-otel-nodejs-arm64-ver-1-30-2:1"

What language layer did you use?
nodejs

Additional context

  1. I've tried adding sending_queue and retry_on_failure, but the issue remains the same (see the sketch after this list). The error comes after the lambda terminates, when the collector tries to send the metrics after some time.
  2. The lambda has no VPC. The main lambda code can access the grafana.net URL; the network issue occurs only in the ADOT layer.
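This is roughly what I mean by sending_queue and retry_on_failure; a minimal sketch with illustrative values (the intervals are examples, not the exact values I used):

exporters:
  otlphttp:
    endpoint: "https://otlp-gateway-prod-eu-west-2.grafana.net/otlp"
    headers:
      authorization: "Basic xxx"
    # enable the in-memory queue so rejected batches are not dropped immediately
    sending_queue:
      enabled: true
    # retry failed exports with backoff; interval values are illustrative
    retry_on_failure:
      enabled: true
      initial_interval: 5s
      max_interval: 30s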

I've seen a similar issue reported, but it went stale: #271
