kube-prometheus and self-managed Gitlab #1075
Unanswered · w3irdrobot asked this question in Q&A
Replies: 1 comment
Yes, there were older versions of Kubernetes that exposed these metrics with `container_name` and `pod_name` labels, which has since changed. I believe the breaking change was from Kubernetes v1.14+. It might be worth checking an older version of kube-prometheus for either an out-of-the-box experience or a way to do the relabelling. Otherwise, updating your Kubernetes version should work too.
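If you just want to verify the queries in the Prometheus UI before committing to a relabelling, PromQL's `label_replace` can copy an existing label under a new name at query time (this doesn't fix GitLab's built-in queries, it only lets you confirm the data is there; the metric name below is an illustrative cadvisor metric, not from the original post):

```promql
# label_replace(vector, dst_label, replacement, src_label, regex)
# Copies the value of the `container` label into a new `container_name` label.
label_replace(
  container_cpu_usage_seconds_total,
  "container_name", "$1",
  "container", "(.+)"
)
```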
I manage a self-managed GitLab install. We are trying to use kube-prometheus (which is wonderful, by the way) as the Prometheus instance GitLab points to for showing metrics in the UI when deploying to environments. There is good documentation on all of this. I have kube-prometheus set up and data populating in Prometheus. However, some of the labels are named slightly differently than what GitLab expects for its integrations.

Specifically, I'm talking about the Kubernetes metrics it can pull. GitLab expects labels called `container_name` and `pod_name`. However, the Prometheus Operator relabels them as `container` and `pod` instead, which causes the queries GitLab issues to return empty results. I've been testing these in the Prometheus UI for now; if I change the labels in GitLab's queries to the actual label names, the queries work great.

This led me to attempt relabeling in one of my `ServiceMonitor`s. Looking at the Targets screen, I can see that after this change both `container_name` and `container` exist (and likewise for pods). However, when I query, only `container` is available.

This may just be my inexperience with Prometheus, but does anyone know what might be going on here?
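For context, a common gotcha with this symptom: in a `ServiceMonitor`, `relabelings` are applied to the target before the scrape (and show up on the Targets page), while `metricRelabelings` are applied to the scraped samples, which is what is needed for a label to appear on the stored series you query. A minimal sketch of the latter (the names and selector here are hypothetical, not from the original post):

```yaml
apiVersion: monitoring.coreos.com/v1
kind: ServiceMonitor
metadata:
  name: example-app        # hypothetical name
spec:
  selector:
    matchLabels:
      app: example-app     # hypothetical selector
  endpoints:
    - port: metrics
      # metricRelabelings (not relabelings) rewrite labels on the
      # scraped samples themselves, so the copied label is queryable.
      # The default action is "replace".
      metricRelabelings:
        - sourceLabels: [container]
          targetLabel: container_name
        - sourceLabels: [pod]
          targetLabel: pod_name
```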