Component(s)
exporter/loadbalancing
What happened?
Description
When the loadbalancing exporter is configured with the k8s resolver pointing at the FQDN of an OpenTelemetry Collector StatefulSet headless service, the exporter continually logs the error message:

couldn't find the exporter for the endpoint ""

This seems to be linked to #43960 and #43950.

If I configure the collector with the dns resolver and the same service FQDN, the collector works as expected; a sketch of that working variant follows.
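For comparison, here is a minimal sketch of the dns resolver variant that works for me, assuming the same service FQDN and port as in the k8s resolver configuration further down (TLS and other unchanged settings carried over):

exporters:
  loadbalancing:
    routing_key: traceID
    protocol:
      otlp:
        timeout: 10s
        tls:
          ca_file: /etc/otel/tls/ca.crt
          insecure: false
    resolver:
      # dns resolver pointed at the same headless service FQDN; the port is
      # assumed to match the 4317 port listed under the k8s resolver below
      dns:
        hostname: otelsampler.test.svc.cluster.local
        port: 4317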
Steps to Reproduce
Configure the collector using the loadbalancing exporter and the k8s resolver as below.
Expected Result
The OpenTelemetry Collector exports traces to the collector instances resolved from the given service FQDN.
Actual Result
The error shown in the log output below is logged continually.
Collector version
v0.140.0
Environment information
Environment
Running on k8s with image: https://hub.docker.com/layers/otel/opentelemetry-collector-k8s/0.140.0
OpenTelemetry Collector configuration
exporters:
  loadbalancing:
    protocol:
      otlp:
        timeout: 10s
        tls:
          ca_file: /etc/otel/tls/ca.crt
          insecure: false
    resolver:
      k8s:
        ports:
          - 4317
        return_hostnames: true
        service: otelsampler.test.svc.cluster.local
        timeout: 10s
    routing_key: traceID
extensions:
  basicauth/server:
    htpasswd:
      inline: |
        ${env:OTEL_USER}:${env:OTEL_PASS}
  health_check:
    endpoint: ${env:MY_POD_IP}:13133
processors:
  memory_limiter:
    check_interval: 1s
    limit_percentage: 95
    spike_limit_percentage: 5
receivers:
  otlp:
    protocols:
      grpc:
        auth: null
        endpoint: ${env:MY_POD_IP}:4317
        include_metadata: true
        tls:
          cert_file: /etc/otel/tls/tls.crt
          key_file: /etc/otel/tls/tls.key
  prometheus:
    config:
      scrape_configs:
        - job_name: opentelemetry-collector
          scrape_interval: 10s
          static_configs:
            - targets:
                - ${env:MY_POD_IP}:8888
service:
  extensions:
    - basicauth/server
    - health_check
  pipelines:
    traces:
      exporters:
        - loadbalancing
      processors:
        - memory_limiter
      receivers:
        - otlp
  telemetry:
    logs:
      encoding: json
      level: info
    metrics:
      readers:
        - pull:
            exporter:
              prometheus:
                host: ${env:MY_POD_IP}
                port: 8888

Log output
{"level":"error","ts":"2025-11-24T09:05:15.271Z","caller":"internal/base_exporter.go:114","msg":"Exporting failed. Rejecting data.","resource":{"service.instance.id":"941469d8-304f-4f77-a086-584f9cb6965a","service.name":"otelcol-k8s","service.version":"0.140.1"},"otelcol.component.id":"loadbalancing","otelcol.componen
t.kind":"exporter","otelcol.signal":"traces","error":"couldn't find the exporter for the endpoint \"\"","rejected_items":4,"stacktrace":"go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*BaseExporter).Send\n\tgo.opentelemetry.io/collector/exporter/[email protected]/internal/base_exporter.go:114\ng
o.opentelemetry.io/collector/exporter/exporterhelper/internal.NewTracesRequest.newConsumeTraces.func1\n\tgo.opentelemetry.io/collector/exporter/[email protected]/internal/new_request.go:123\ngo.opentelemetry.io/collector/consumer.ConsumeTracesFunc.ConsumeTraces\n\tgo.opentelemetry.io/collector/[email protected]/t
races.go:27\ngo.opentelemetry.io/collector/service/internal/refconsumer.refTraces.ConsumeTraces\n\tgo.opentelemetry.io/collector/[email protected]/internal/refconsumer/traces.go:29\ngo.opentelemetry.io/collector/processor/processorhelper.NewTraces.func1\n\tgo.opentelemetry.io/collector/processor/[email protected].
0/traces.go:71\ngo.opentelemetry.io/collector/consumer.ConsumeTracesFunc.ConsumeTraces\n\tgo.opentelemetry.io/collector/[email protected]/traces.go:27\ngo.opentelemetry.io/collector/service/internal/refconsumer.refTraces.ConsumeTraces\n\tgo.opentelemetry.io/collector/[email protected]/internal/refconsumer/traces.go:29\n
go.opentelemetry.io/collector/processor/processorhelper.NewTraces.func1\n\tgo.opentelemetry.io/collector/processor/[email protected]/traces.go:71\ngo.opentelemetry.io/collector/consumer.ConsumeTracesFunc.ConsumeTraces\n\tgo.opentelemetry.io/collector/[email protected]/traces.go:27\ngo.opentelemetry.io/collector/
service/internal/refconsumer.refTraces.ConsumeTraces\n\tgo.opentelemetry.io/collector/[email protected]/internal/refconsumer/traces.go:29\ngo.opentelemetry.io/collector/consumer.ConsumeTracesFunc.ConsumeTraces\n\tgo.opentelemetry.io/collector/[email protected]/traces.go:27\ngo.opentelemetry.io/collector/internal/fanoutc
onsumer.(*tracesConsumer).ConsumeTraces\n\tgo.opentelemetry.io/collector/internal/[email protected]/traces.go:60\ngo.opentelemetry.io/collector/receiver/otlpreceiver/internal/trace.(*Receiver).Export\n\tgo.opentelemetry.io/collector/receiver/[email protected]/internal/trace/otlp.go:42\ngo.opentelemetry.io/co
llector/pdata/ptrace/ptraceotlp.rawTracesServer.Export\n\tgo.opentelemetry.io/collector/[email protected]/ptrace/ptraceotlp/grpc.go:87\ngo.opentelemetry.io/collector/pdata/internal/otelgrpc.traceServiceExportHandler.func1\n\tgo.opentelemetry.io/collector/[email protected]/internal/otelgrpc/trace_service.go:72\ngo.opentelemetr
y.io/collector/config/configgrpc.(*ServerConfig).getGrpcServerOptions.enhanceWithClientInformation.func9\n\tgo.opentelemetry.io/collector/config/[email protected]/configgrpc.go:576\ngo.opentelemetry.io/collector/pdata/internal/otelgrpc.traceServiceExportHandler\n\tgo.opentelemetry.io/collector/[email protected]/internal
/otelgrpc/trace_service.go:74\ngoogle.golang.org/grpc.(*Server).processUnaryRPC\n\tgoogle.golang.org/[email protected]/server.go:1431\ngoogle.golang.org/grpc.(*Server).handleStream\n\tgoogle.golang.org/[email protected]/server.go:1842\ngoogle.golang.org/grpc.(*Server).serveStreams.func2.1\n\tgoogle.golang.org/[email protected]/serve
r.go:1061"}Additional context
No response