Improve ABI compatibility for KTransformersOps extension #1358
This PR fixes ABI compatibility issues that caused `undefined symbol` errors when loading `ktransformers`.

Previously, `KTransformersOps` was compiled without explicitly setting the C++ ABI flag, so it could be built against a different ABI than the installed PyTorch, leading to runtime errors like `KTransformersOps.cpython-311-x86_64-linux-gnu.so: undefined symbol: _ZN3c106detail23torchInternalAssertFailEPKcS2_jS2_RKSs`.

This change ensures that `KTransformersOps` is compiled with the C++ ABI flag (`-D_GLIBCXX_USE_CXX11_ABI=0` or `=1`) that matches the PyTorch installation. The core change updates `setup.py` to apply the ABI flag to both the `nvcc` and `mcc` compilation steps, in line with PyTorch's requirements.
`get_compile_abi_args` has been renamed to `get_cmake_abi_args`. Although the new name might suggest it is only for CMake arguments, the function now generates ABI-specific compiler flags both for CMake-based extensions and for direct `CUDAExtension`/`MUSAExtension` builds.
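As a rough sketch of the mechanism (helper names other than `get_cmake_abi_args` are hypothetical, and the bodies are illustrative rather than the exact implementation): the ABI setting is read from the installed PyTorch and the same define is fanned out to the CMake build and to the host/device compilers of the direct extension builds.

```python
def pytorch_cxx11_abi() -> int:
    """Return 1 if the installed PyTorch was built with the CXX11 ABI, else 0."""
    import torch  # imported lazily so this sketch loads without torch installed
    return int(torch._C._GLIBCXX_USE_CXX11_ABI)

def get_cmake_abi_args(abi: int) -> list[str]:
    """The ABI define in CMake form, for CMake-based extension builds."""
    return [f"-DCMAKE_CXX_FLAGS=-D_GLIBCXX_USE_CXX11_ABI={abi}"]

def get_extension_abi_args(abi: int) -> dict[str, list[str]]:
    """The ABI define for direct CUDAExtension builds: the same flag must
    reach both the host C++ compiler and nvcc (analogously, mcc for
    MUSAExtension), so every translation unit agrees with PyTorch's ABI."""
    flag = f"-D_GLIBCXX_USE_CXX11_ABI={abi}"
    return {"cxx": [flag], "nvcc": [flag]}
```

In `setup.py` this would be wired up roughly as `CUDAExtension(..., extra_compile_args=get_extension_abi_args(pytorch_cxx11_abi()))`; if any translation unit is compiled with a mismatched value, `std::string`-taking symbols mangle differently and fail to resolve at load time, which is exactly the `undefined symbol` error above.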