
Conversation

zakisk
Contributor

@zakisk zakisk commented Oct 12, 2025

Ensures a consistent context key is used for all GitLab commit statuses, allowing failures from invalid PipelineRuns to be correctly overwritten on subsequent runs.

The Problem:

When a repository contained multiple PipelineRun definitions and one was invalid (e.g., due to a validation error), the initial commit would correctly report a "failed" status to GitLab for that invalid run. However, the context key for this failure was generic (e.g., "ApplicationName") because a full PipelineRun wasn't provided.

After a developer fixed the invalid PipelineRun and pushed a new commit, the now-successful run would post its status with a more specific context key (e.g., "ApplicationName/PipelineRunName").

Because the GitLab API does not allow deleting commit pipeline jobs, the original, generic "failed" status remained on the commit forever. This resulted in the overall commit pipeline being permanently and incorrectly marked as "failed," even though all pipelines eventually succeeded.

The Solution:

This commit provides the PipelineRun name from the "OriginalPRName" annotation so that the contextKey is uniform for successful PipelineRuns and for PipelineRuns that failed validation.
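A rough sketch of the idea (not the actual provider code; the annotation key, helper name, and application name below are illustrative assumptions based on the description above):

```go
// Sketch only: build the GitLab commit-status context key the same way for
// both valid and invalid PipelineRuns, falling back to the OriginalPRName
// annotation when the full PipelineRun is not available.
package main

import "fmt"

// Assumed annotation key; the PR only refers to it as "OriginalPRName".
const originalPRNameAnnotation = "pipelinesascode.tekton.dev/original-prname"

// contextKey returns "<application>/<pipelinerun>" when a run name is known,
// otherwise just "<application>" (the old, generic behaviour).
func contextKey(applicationName string, annotations map[string]string, runName string) string {
	if runName == "" {
		// Fall back to the original PipelineRun name from the annotation so a
		// later successful run posts to the same context and overwrites the failure.
		runName = annotations[originalPRNameAnnotation]
	}
	if runName == "" {
		return applicationName
	}
	return fmt.Sprintf("%s/%s", applicationName, runName)
}

func main() {
	annotations := map[string]string{originalPRNameAnnotation: "my-pipelinerun"}
	// Failed validation: no full PipelineRun, but the annotation still gives a stable key.
	fmt.Println(contextKey("Pipelines as Code CI", annotations, ""))
	// Successful run: same key, so the earlier failure is overwritten in GitLab.
	fmt.Println(contextKey("Pipelines as Code CI", annotations, "my-pipelinerun"))
}
```

With the annotation-derived name, the failed-validation status and the later successful status share one context key, so GitLab overwrites the failure instead of keeping it around.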

Before

(screenshot)

After

(screenshot)

After fixing the PipelineRun syntax error, here is how the status is updated with the correct job name:

(screenshot)

📝 Description of the Change

👨🏻‍💻 Linked Jira

https://issues.redhat.com/browse/SRVKP-9044

🔗 Linked GitHub Issue

Fixes #

🚀 Type of Change

  • 🐛 Bug fix (fix:)
  • ✨ New feature (feat:)
  • 💥 Breaking change (feat!:, fix!:)
  • 📚 Documentation update (docs:)
  • ⚙️ Chore (chore:)
  • 💅 Refactor (refactor:)
  • 🔧 Enhancement (enhance:)
  • 📦 Dependency update (deps:)

🧪 Testing Strategy

  • Unit tests
  • Integration tests
  • End-to-end tests
  • Manual testing
  • Not Applicable

🤖 AI Assistance

  • I have not used any AI assistance for this PR.
  • I have used AI assistance for this PR.

If you have used AI assistance, please provide the following details:

Which LLM was used?

  • GitHub Copilot
  • ChatGPT (OpenAI)
  • Claude (Anthropic)
  • Cursor
  • Gemini (Google)
  • Other: ____________

Extent of AI Assistance:

  • Documentation and research only
  • Unit tests or E2E tests only
  • Code generation (parts of the code)
  • Full code generation (most of the PR)
  • PR description and comments
  • Commit message(s)

Important

If the majority of the code in this PR was generated by an AI, please add a Co-authored-by trailer to your commit message.
For example:

Co-authored-by: Gemini [email protected]
Co-authored-by: ChatGPT [email protected]
Co-authored-by: Claude [email protected]
Co-authored-by: Cursor [email protected]
Co-authored-by: Copilot [email protected]

💡 You can use the script ./hack/add-llm-coauthor.sh to automatically add these co-author trailers to your commits.

✅ Submitter Checklist

  • 📝 My commit messages are clear, informative, and follow the project's How to write a git commit message guide. The Gitlint linter validates this in CI.
  • ✨ I have ensured my commit message prefix (e.g., fix:, feat:) matches the "Type of Change" I selected above.
  • ♽ I have run make test and make lint locally to check for and fix any issues. For an efficient workflow, I have considered installing pre-commit and running pre-commit install to automate these checks.
  • 📖 I have added or updated documentation for any user-facing changes.
  • 🧪 I have added sufficient unit tests for my code changes.
  • 🎁 I have added end-to-end tests where feasible. See README for more details.
  • 🔎 I have addressed any CI test flakiness or provided a clear reason to bypass it.
  • If adding a provider feature, I have filled in the following and updated the provider documentation:
    • GitHub App
    • GitHub Webhook
    • Gitea/Forgejo
    • GitLab
    • Bitbucket Cloud
    • Bitbucket Data Center


Summary of Changes

Hello @zakisk, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request resolves a critical problem where GitLab commit statuses would incorrectly remain marked as 'failed' due to inconsistent context keys. Previously, initial validation failures used a generic context key, while subsequent successful runs used a more specific one, preventing the successful status from overwriting the original failure. The change introduces the use of the 'OriginalPRName' annotation to provide a uniform PipelineRun name for the context key, allowing successful PipelineRuns to correctly update the commit status and reflect the true state of the pipeline.

Highlights

  • Stale GitLab Statuses: Addresses an issue where GitLab commit statuses could remain in a 'failed' state indefinitely, even after the underlying PipelineRuns were fixed and succeeded.
  • Consistent Context Keys: Ensures a consistent context key is used for all GitLab commit statuses, which is crucial for correctly overwriting previous failure statuses.
  • PipelineRun Name from Annotation: The PipelineRun name is now provided from the 'OriginalPRName' annotation, ensuring uniformity in the context key for both successful and validation-failed PipelineRuns.


@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request addresses an issue where stale failure statuses would persist on GitLab commits due to inconsistent context keys. The change ensures that when a PipelineRun fails to be created (e.g., due to a validation error), the failure status is created with a consistent context key derived from the OriginalPRName annotation. This allows subsequent successful runs to correctly overwrite the status. The implementation is sound and correctly solves the described problem. I have no further comments.

@zakisk zakisk force-pushed the SRVKP-9044-fix-gitlab-plr-status branch 4 times, most recently from 077d55f to e325263 Compare October 12, 2025 15:52
}

func (o *OpenshiftConsole) URL() string {
	if o.host == "" {
Contributor Author


If PaC is not configured with any console, it was also causing a GitLab API error: "{target_url: [is blocked: URI is invalid]}}"

Member


Please instead add a func (o *OpenshiftConsole) Host() getter which defaults to openshift.url.is.not.configured when o.host is not set, so we're not duplicating this fallback logic in every function
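For illustration, a minimal sketch of the getter suggested above (the constant name and the URL scheme are placeholders, not the merged implementation):

```go
// Sketch of centralizing the fallback instead of checking o.host in every method.
package consoleui

const unsetConsoleHost = "openshift.url.is.not.configured"

type OpenshiftConsole struct {
	host string
}

// Host returns the configured console host, or a placeholder value when no
// console is configured, so callers never emit an empty or invalid URI that
// GitLab rejects with "target_url ... is blocked: URI is invalid".
func (o *OpenshiftConsole) Host() string {
	if o.host == "" {
		return unsetConsoleHost
	}
	return o.host
}

// URL builds on Host(), so the fallback logic lives in one place.
func (o *OpenshiftConsole) URL() string {
	return "https://" + o.Host()
}
```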

@zakisk
Contributor Author

zakisk commented Oct 13, 2025

/test


pipelines-as-code bot commented Oct 13, 2025

🔍 PR Lint Feedback

Note: This automated check helps ensure your PR follows our contribution guidelines.

⚠️ Items that need attention:

🤖 AI attribution

The following commits lack an explicit AI attribution footer:

  • a77f07f fix: prevent stale failure statuses on commits

If no AI assistance was used for a commit, you can ignore this warning.
Otherwise add an Assisted-by: or Co-authored-by: footer referencing the AI used.


ℹ️ Next Steps

  • Review and address the items above
  • Push new commits to update this PR
  • This comment will be automatically updated when issues are resolved
🔧 Admin Tools

Automated Issue/Ticket Creation:

  • /issue-create - Generate a GitHub issue from this PR content using AI
  • /jira-create - Create a SRVKP Jira ticket from this PR content using AI

⚠️ Important: Always review and edit generated content before finalizing tickets/issues.
The AI-generated content should be used as a starting point and may need adjustments.

These commands are available to maintainers and will post the generated content as PR comments for review.

🤖 This feedback was generated automatically by the PR CI system

@chmouel
Member

chmouel commented Oct 13, 2025

Can we have an e2e test please?

@zakisk
Contributor Author

zakisk commented Oct 13, 2025

Can we have an e2e test please?

I thought about it, but there is no way to confirm how the jobs are posted because GitLab doesn't provide an API endpoint to check jobs in a commit status pipeline.

@chmouel
Member

chmouel commented Oct 13, 2025

Are you sure? That surprises me; Gemini says otherwise as well:

Here's a breakdown of what the GitLab API does provide:

Pipelines API: You can list pipelines for a project and filter them by sha (commit hash) or ref (branch/tag). Once you have the pipeline ID, you can get detailed information about that specific pipeline, including its overall status (e.g., success, failed, running, pending).

Endpoint example: GET /projects/:id/pipelines?sha=<commit_sha>

Jobs API: Once you have a pipeline ID (which you can get from the Pipelines API using the commit SHA), you can use the Jobs API to list all jobs in that pipeline and check their individual status.

Endpoint example: GET /projects/:id/pipelines/:pipeline_id/jobs

Commit Statuses API: This API specifically allows you to set and list the statuses associated with a specific commit SHA. This is what GitLab's CI/CD uses to show the status badges (the little colored icons) on the merge request.

Endpoint example: GET /projects/:id/repository/commits/:sha/statuses

The roadblock the user zakisk mentioned—"there is no way to confirm that how job are posted because GitLab doesn't provide API endpoint to check jobs in a commit status pipeline"—likely refers to a difficulty in reliably linking an external e2e test job's status back to a specific set of automatically created jobs within a single, unified pipeline status or perhaps the challenge of getting a list of only the jobs created as part of the commit's status without the context of a pipeline ID.

However, by chaining API calls (getting the pipeline by SHA, then getting the jobs for that pipeline ID), you can generally retrieve the status of all CI jobs run for a given commit.

@zakisk
Contributor Author

zakisk commented Oct 13, 2025

Commit Statuses API: This API specifically allows you to set and list the statuses associated with a specific commit SHA. This is what GitLab's CI/CD uses to show the status badges (the little colored icons) on the merge request.

Endpoint example: GET /projects/:id/repository/commits/:sha/statuses

Yeah, I was kinda obsessed with my previous findings and missed this. We can do it using the Commit Statuses API.
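For reference, a hedged sketch of how an e2e check could list those statuses; only the GET /projects/:id/repository/commits/:sha/statuses endpoint quoted above is assumed, while the base URL, token handling, and type names are placeholders:

```go
// Sketch: list the commit statuses GitLab recorded for a SHA via plain HTTP.
package e2e

import (
	"encoding/json"
	"fmt"
	"net/http"
	"net/url"
)

type commitStatus struct {
	Name   string `json:"name"`   // the context key, e.g. "ApplicationName/PipelineRunName"
	Status string `json:"status"` // e.g. "success", "failed"
}

// getCommitStatuses calls the Commit Statuses API for one commit.
func getCommitStatuses(baseURL, token, projectID, sha string) ([]commitStatus, error) {
	endpoint := fmt.Sprintf("%s/api/v4/projects/%s/repository/commits/%s/statuses",
		baseURL, url.PathEscape(projectID), sha)
	req, err := http.NewRequest(http.MethodGet, endpoint, nil)
	if err != nil {
		return nil, err
	}
	req.Header.Set("PRIVATE-TOKEN", token)
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()
	var statuses []commitStatus
	if err := json.NewDecoder(resp.Body).Decode(&statuses); err != nil {
		return nil, err
	}
	return statuses, nil
}
```

The test could then assert that the status posted under the PipelineRun's context key ends up as "success" and that no stale, generic-context failure remains on the commit.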

@zakisk zakisk force-pushed the SRVKP-9044-fix-gitlab-plr-status branch from e325263 to a77f07f Compare October 13, 2025 12:00
@zakisk
Contributor Author

zakisk commented Oct 13, 2025

@chmouel added an E2E test.

