
Conversation

@mshahneo
Contributor

Reverts #166204

The original change caused a build failure due to a missing dependency.

@llvmbot
Member

llvmbot commented Nov 25, 2025

@llvm/pr-subscribers-mlir

@llvm/pr-subscribers-mlir-gpu

Author: Md Abdullah Shahneous Bari (mshahneo)

Changes

Reverts llvm/llvm-project#166204

The original change caused a build failure due to a missing dependency.


Full diff: https://github.com/llvm/llvm-project/pull/169570.diff

1 File Affected:

  • (modified) mlir/lib/Dialect/GPU/Pipelines/GPUToXeVMPipeline.cpp (-3)
diff --git a/mlir/lib/Dialect/GPU/Pipelines/GPUToXeVMPipeline.cpp b/mlir/lib/Dialect/GPU/Pipelines/GPUToXeVMPipeline.cpp
index 38313dc3c01d5..b097d3a0c9686 100644
--- a/mlir/lib/Dialect/GPU/Pipelines/GPUToXeVMPipeline.cpp
+++ b/mlir/lib/Dialect/GPU/Pipelines/GPUToXeVMPipeline.cpp
@@ -111,11 +111,8 @@ void buildPostGPUCommonPassPipeline(
     pm.addPass(createGpuToLLVMConversionPass(gpuToLLVMOptions));
   }
   pm.addPass(createLowerAffinePass());
-  pm.addPass(createConvertVectorToLLVMPass());
   pm.addPass(createConvertToLLVMPass());
   pm.addPass(createReconcileUnrealizedCastsPass());
-  pm.addNestedPass<gpu::GPUModuleOp>(createCanonicalizerPass());
-  pm.addNestedPass<gpu::GPUModuleOp>(createCSEPass());
   // gpu-module-to-binary
   {
     GpuModuleToBinaryPassOptions gpuToModuleBinOptions;

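For context on what a re-land would restore, the sketch below reconstructs the pipeline tail as it looked before this revert, using only the pass names from the hunk above. The helper name and the includes are assumptions for illustration, and the missing dependency is presumed (not confirmed here) to be the conversion library that provides createConvertVectorToLLVMPass().

```cpp
// Sketch only: the reverted pass ordering from the hunk above.
// Includes and the helper name are assumptions; the pass create functions
// themselves are the ones named in the diff.
#include "mlir/Conversion/Passes.h"          // conversion pass create functions
#include "mlir/Dialect/GPU/IR/GPUDialect.h"  // gpu::GPUModuleOp
#include "mlir/Pass/PassManager.h"           // OpPassManager
#include "mlir/Transforms/Passes.h"          // canonicalizer, CSE

using namespace mlir;

// Hypothetical helper mirroring the tail of buildPostGPUCommonPassPipeline
// as it stood before the revert.
static void buildPostGPUCommonTail(OpPassManager &pm) {
  pm.addPass(createLowerAffinePass());
  pm.addPass(createConvertVectorToLLVMPass());  // removed by this revert
  pm.addPass(createConvertToLLVMPass());
  pm.addPass(createReconcileUnrealizedCastsPass());
  pm.addNestedPass<gpu::GPUModuleOp>(createCanonicalizerPass());  // removed
  pm.addNestedPass<gpu::GPUModuleOp>(createCSEPass());            // removed
}
```

Re-landing would presumably also need the corresponding conversion library added to the GPU Pipelines build target, which is the kind of missing dependency the revert message describes.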
@github-actions

⚠️ We detected that you are using a GitHub private e-mail address to contribute to the repo.
Please turn off the "Keep my email addresses private" setting in your account.
See LLVM Developer Policy and LLVM Discourse for more information.

@mshahneo mshahneo requested a review from silee2 November 25, 2025 21:34
@mshahneo mshahneo merged commit 9bf78ab into main Nov 25, 2025
11 of 12 checks passed
@mshahneo mshahneo deleted the revert-166204-issue_1337 branch November 25, 2025 21:36