
Conversation

@anwesha-palit-redhat
Contributor

Description of the Problem

The customer is confused and ends up with incorrect task configuration when editing pipelines through the Pipeline Builder UI.
Because the side panel displays stale or incorrect parameter data for tasks that share the same name across namespaces, users may unknowingly save or deploy pipelines with invalid configurations.
This leads to pipeline failures and loss of confidence in the UI’s reliability.


Workaround

A partial workaround exists:
Users can manually verify and edit the Pipeline YAML to ensure it references the correct task and parameters from the intended namespace.
However, this requires advanced Tekton knowledge and defeats the purpose of the visual builder — making it unsuitable for most users.


Versions

  • OCP Version: 4.16
  • Pipelines Operator Version: 1.17, 1.18

Steps to Reproduce

  1. Create a task with a name identical to an existing task in the openshift-pipelines namespace.
  2. Using the Pipeline Builder UI, create a new pipeline and add the task from the openshift-pipelines namespace.
  3. Observe that the Pipeline Builder side panel shows parameters from the task in the current namespace instead of the selected one.
  4. Check the YAML editor — it correctly references the task from the cluster resolver and displays the accurate parameter data.
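For reference, a pipeline task resolved from another namespace via the Tekton cluster resolver (what the YAML editor shows in step 4) looks roughly like the sketch below; the task name is illustrative, not taken from this PR:

```yaml
# Illustrative sketch only: a pipeline task referencing a Task in the
# openshift-pipelines namespace through the Tekton cluster resolver.
tasks:
  - name: example-task
    taskRef:
      resolver: cluster
      params:
        - name: kind
          value: task
        - name: name
          value: example-task
        - name: namespace
          value: openshift-pipelines
```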

Reproducibility

  • Yes, always reproducible

Customer Impact

  • Customer Name: Jeffrey Luckett
  • Revenue Impact: Medium — misconfigured pipelines can lead to failed runs and CI/CD delays, increasing support overhead and reducing user trust in the console plugin UI.

Root Cause

In the previous implementation, installation of Artifact Hub tasks failed because tasks with the same names already existed in the cluster resolver or namespace.
As a result:

  • The data displayed in the Pipeline Builder form became stale.
  • When tasks were removed from the Pipeline Builder, the actual cluster task was not uninstalled.
  • Upon reinstalling a task with the same name, the old (stale) data persisted.

Fix Implemented

  • Added a check for existing tasks with the same name before installation.

  • If a conflict is found, a safe name is generated and used for creating a new task.

  • During task removal, the system:

    • Checks if a task with the same name exists.
    • If yes, uninstalls it to maintain data hygiene.
    • If not, it updates the React state accordingly.

This ensures consistent task data and prevents stale or incorrect parameter references.
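The safe-name generation step can be sketched as follows; `getSafeTaskName` is a hypothetical helper for illustration, not necessarily the name used in the PR:

```typescript
// Hypothetical sketch of the conflict-avoidance logic described above:
// if the requested name is already taken, append an increasing numeric
// suffix until an unused name is found.
const getSafeTaskName = (name: string, existingNames: string[]): string => {
  if (!existingNames.includes(name)) {
    return name;
  }
  let suffix = 1;
  while (existingNames.includes(`${name}-${suffix}`)) {
    suffix += 1;
  }
  return `${name}-${suffix}`;
};
```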


Screen Recordings

(Since GitHub upload size limits apply, videos are uploaded to Google Drive)

@vikram-raj vikram-raj changed the title fix: [SRVKP-8998] Pipeline Builder UI shows stale parameter data for tasks with same name across namespaces SRVKP-8998: fix Pipeline Builder UI shows stale parameter data for tasks with same name across namespaces Oct 22, 2025
@openshift-ci-robot
Collaborator

openshift-ci-robot commented Oct 22, 2025

@anwesha-palit-redhat: This pull request references SRVKP-8998 which is a valid jira issue.

Warning: The referenced jira issue has an invalid target version for the target branch this PR targets: expected the bug to target the "4.21.0" version, but no target version was set.

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the openshift-eng/jira-lifecycle-plugin repository.

export const fetchArtifactHubTasks = async (
  query: string,
-  limit: number = 20,
+  limit = 20,

Let's keep the type.

Contributor Author

This was a lint error due to type auto-inference, so I will switch it back.

  taskExists = true;
} catch (error) {
  console.warn(
    `Error fetching Task as task does not exist ${taskName}, so safely deleting from PipelineBuilder`,

Let's check the RC (return code) to confirm it's really a 404.


I think the whole onRemove() should be a callback; if anything, for better readability.

Contributor Author

Done, thanks
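The reviewer's point about verifying the return code could be handled with a small guard like the sketch below; the error shape (`error.response.status`) is an assumption, since Kubernetes client errors vary by library:

```typescript
// Sketch, assuming the caught error exposes an HTTP status code on
// error.response.status (adjust to the actual error shape of the client).
interface HttpLikeError {
  response?: { status?: number };
}

// Treat only a 404 as "the Task does not exist"; other errors should
// not silently trigger removal from the builder.
const isNotFound = (error: unknown): boolean =>
  typeof error === 'object' &&
  error !== null &&
  (error as HttpLikeError).response?.status === 404;
```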

  );
}
if (taskExists) {
  await k8sDelete({

This risks deleting a user-managed Task that just happens to share the name. Shouldn't we only delete Tasks that the builder itself installed in this flow, checking TektonTaskAnnotation.installedFrom?

Contributor Author
The reason will be displayed to describe this comment to others. Learn more.

That's a good point. Using installedFrom limits cleanup to builder-installed tasks, but users may also install or update tasks via the CLI or YAML; in this case, they were installing tasks with the CLI and then adding them via the builder.

Therefore, I went with a more general approach, as I felt it might be safer long term: one that ensures consistent cleanup while still protecting user-managed resources.
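A guarded cleanup along the lines the reviewer suggests might look like the sketch below; the annotation key and value are assumptions for illustration, not taken from the actual TektonTaskAnnotation definition:

```typescript
// Sketch: only treat a Task as safe to uninstall when it carries the
// builder's install annotation. Key and value are assumed placeholders.
interface TaskResource {
  metadata?: { annotations?: Record<string, string> };
}

const INSTALLED_FROM_ANNOTATION = 'tekton.dev/installedFrom'; // assumed key

const isBuilderInstalledTask = (task: TaskResource): boolean =>
  task.metadata?.annotations?.[INSTALLED_FROM_ANNOTATION] === 'ArtifactHub';
```

User-managed Tasks that merely share a name would fail this check and be left untouched.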

Contributor Author
The reason will be displayed to describe this comment to others. Learn more.

Updated it according to what is suggested

Contributor Author
The reason will be displayed to describe this comment to others. Learn more.

Updated according to what @vikram-raj suggested, i.e., reverted to the original behavior: do not delete the tasks, only update the Pipeline Builder UI state.

useCleanupOnFailure(failedTasks, onUpdateTasks, taskGroup);

// Get all existing task names from taskGroup and installed tasks
const getExistingTaskNames = (): string[] => {

The current implementation does not account for duplicate names. I would suggest something like:

const getExistingTaskNames = (): string[] => {
  const taskNames = new Set<string>();

  // Add all tasks currently in the builder
  [
    ...taskGroup.tasks,
    ...taskGroup.finallyTasks,
    ...taskGroup.listTasks,
    ...taskGroup.loadingTasks,
    ...taskGroup.finallyListTasks,
  ].forEach((t) => {
    if (t?.name) taskNames.add(t.name);
  });

  // Add installed catalog items (avoid duplicates)
  catalogService.items.forEach((catalogItem) => {
    const name = catalogItem.data?.metadata?.name;
    if (name) taskNames.add(name);
  });

  return Array.from(taskNames);
};

Contributor Author

Done, thanks!

).catch(() =>
  setFailedTasks([...failedTasks, item.data.task.name]),
);
// Checking if task with same name already exists, if yes then create with a different name to avoid conflict

Nit: these two blocks are pretty much the same, just using a different model. Let's instead create a callback that takes the model and its parameters.

Contributor Author
The reason will be displayed to describe this comment to others. Learn more.

Done, thanks

@openshift-ci
Contributor

openshift-ci bot commented Oct 31, 2025

@jhadvig: changing LGTM is restricted to collaborators

In response to this:

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository.

Member

@vikram-raj vikram-raj left a comment

/lgtm

@openshift-ci
Contributor

openshift-ci bot commented Nov 4, 2025

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: anwesha-palit-redhat, vikram-raj

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@openshift-ci openshift-ci bot added the approved label Nov 4, 2025
@vikram-raj vikram-raj merged commit dc8cbc4 into openshift-pipelines:main Nov 4, 2025
4 of 7 checks passed
@anwesha-palit-redhat
Contributor Author

/cherry-pick release-v1.20.x

@anwesha-palit-redhat
Contributor Author

/cherry-pick release-v1.19.x

@openshift-cherrypick-robot

@anwesha-palit-redhat: new pull request created: #805

In response to this:

/cherry-pick release-v1.20.x


@openshift-cherrypick-robot

@anwesha-palit-redhat: new pull request created: #806

In response to this:

/cherry-pick release-v1.19.x


@anwesha-palit-redhat
Contributor Author

/cherry-pick release-v1.18.x

@anwesha-palit-redhat
Contributor Author

/cherry-pick release-v1.17.x

@openshift-cherrypick-robot

@anwesha-palit-redhat: #678 failed to apply on top of branch "release-v1.18.x":

Applying: fix: updated task creation logic in pipeline builder to support installation of tasks with name name
Using index info to reconstruct a base tree...
M	src/components/catalog/apis/artifactHub.ts
M	src/components/pipeline-builder/PipelineBuilderForm.tsx
M	src/components/quick-search/utils/quick-search-utils.tsx
M	src/components/task-quicksearch/PipelineQuickSearch.tsx
Falling back to patching base and 3-way merge...
Auto-merging src/components/task-quicksearch/PipelineQuickSearch.tsx
Auto-merging src/components/quick-search/utils/quick-search-utils.tsx
CONFLICT (content): Merge conflict in src/components/quick-search/utils/quick-search-utils.tsx
Auto-merging src/components/pipeline-builder/PipelineBuilderForm.tsx
Auto-merging src/components/catalog/apis/artifactHub.ts
CONFLICT (content): Merge conflict in src/components/catalog/apis/artifactHub.ts
error: Failed to merge in the changes.
hint: Use 'git am --show-current-patch=diff' to see the failed patch
hint: When you have resolved this problem, run "git am --continue".
hint: If you prefer to skip this patch, run "git am --skip" instead.
hint: To restore the original branch and stop patching, run "git am --abort".
hint: Disable this message with "git config advice.mergeConflict false"
Patch failed at 0001 fix: updated task creation logic in pipeline builder to support installation of tasks with name name

In response to this:

/cherry-pick release-v1.18.x


@openshift-cherrypick-robot

@anwesha-palit-redhat: #678 failed to apply on top of branch "release-v1.17.x":

Applying: fix: updated task creation logic in pipeline builder to support installation of tasks with name name
Using index info to reconstruct a base tree...
M	src/components/catalog/apis/artifactHub.ts
M	src/components/pipeline-builder/PipelineBuilderForm.tsx
M	src/components/quick-search/utils/quick-search-utils.tsx
M	src/components/task-quicksearch/PipelineQuickSearch.tsx
Falling back to patching base and 3-way merge...
Auto-merging src/components/task-quicksearch/PipelineQuickSearch.tsx
Auto-merging src/components/quick-search/utils/quick-search-utils.tsx
CONFLICT (content): Merge conflict in src/components/quick-search/utils/quick-search-utils.tsx
Auto-merging src/components/pipeline-builder/PipelineBuilderForm.tsx
Auto-merging src/components/catalog/apis/artifactHub.ts
CONFLICT (content): Merge conflict in src/components/catalog/apis/artifactHub.ts
error: Failed to merge in the changes.
hint: Use 'git am --show-current-patch=diff' to see the failed patch
hint: When you have resolved this problem, run "git am --continue".
hint: If you prefer to skip this patch, run "git am --skip" instead.
hint: To restore the original branch and stop patching, run "git am --abort".
hint: Disable this message with "git config advice.mergeConflict false"
Patch failed at 0001 fix: updated task creation logic in pipeline builder to support installation of tasks with name name

In response to this:

/cherry-pick release-v1.17.x

