Populate third_party/intel/backend/include with sycl headers in setup.py
#3238
Conversation
Signed-off-by: dchigarev <[email protected]>
```diff
 run: |
   cd python
-  pip install wheel pybind11
+  pip install wheel pybind11 zstandard
```
If we agree to add `zstandard` as a new build dependency, shouldn't we create a single `build_requirements.txt` file that contains all build dependencies and is used in all the `pip install` statements I've changed in this PR?
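For illustration, such a file might look like this (the name `build_requirements.txt` and the exact package set are hypothetical, mirroring the packages the touched workflow steps currently install inline):

```
# build_requirements.txt (hypothetical)
wheel
pybind11
zstandard
```

Each workflow step would then run a single `pip install -r build_requirements.txt` instead of listing packages inline, so new build dependencies only need to be added in one place.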
I think it would be great to separate the build from the dependency installation. Ideally, if the project could be built with cmake alone, that would improve integration with IDEs.
Hi @alexbaden, @etiotto, @vlad-penkin, I can review the implementation details, but I'm not sure we're heading in the direction you originally intended. Please confirm that this doesn't conflict with the plans.
@anmyachev What are your doubts about? Do you think we shouldn't prepare headers at all (and rely on them already existing somewhere in the system), or that we should prepare them but avoid adding additional build-time dependencies (by either finding another download source or by placing the headers directly in the repo)?
The original issue #3147 was closed.
Closes #3147
The main goal of this PR was to get rid of the `icpx` dependency when discovering the sycl lib and sycl headers, and to align this logic with the AMD and Nvidia backends.

The flow of discovering `libsycl.so` in `driver.py` is now the following:

1. the `TRITON_LIBSYCL_PATH` env variable
2. `ldconfig -p` for `libsycl.so`
3. `LD_LIBRARY_PATH` (usually sycl would be found here if oneapi vars are set)
4. `$ONEAPI_ROOT/compiler/latest/lib`
5. the `intel-sycl-rt` python package
6. the `$CONDA_PREFIX/lib` path

If `libsycl.so` wasn't found in any of these locations, assume that `icpx`, which already knows the proper sycl location, will be used.

The sycl headers are now supposed to always be located in `third_party/intel/backend/include/sycl/...`. There's new logic in `setup.py` that searches the system for the sycl headers and copies them to the backend folder. If the headers were not found, it downloads the `intel-sycl-rt` package from conda-forge and extracts the sycl headers into the backend folder.

There are a few problems with downloading headers from conda-forge in `setup.py`:

1. The `intel-sycl-rt` package also contains compiled libraries (and not only headers), which makes the downloaded package ~6.5mb in size instead of a light ~400kb as for the nvidia packages (we may put up with that, though).
2. The nvidia packages are `tar.bz2` archives that can be unpacked using only standard python libraries. The `intel-sycl-rt` package is a `.conda` package, i.e. a `zst`-packed archive. This type of archive cannot be unpacked with standard python libraries and requires the `zstandard` python package to be installed. I've added this requirement to the `scripts/compile-triton.sh` script, but I'm not sure it's okay to add one more dependency for `setup.py`.

Alternatives to solve the 2nd problem are:

1. Host a repacked headers archive ourselves and download it in `setup.py`. The downside of this approach is that we would need to manually reupload new headers once a new version comes up.
2. Don't populate `third_party/intel/backend/include/sycl` at all and rely on the headers existing somewhere in the system and being easy to find.
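The `libsycl.so` discovery order described in this PR can be sketched roughly as follows (a simplified illustration, not the actual `driver.py` code; the `intel-sycl-rt` package lookup step is omitted):

```python
import os
import shutil
import subprocess


def candidate_sycl_dirs():
    """Yield directories to probe for libsycl.so, in the priority order
    described in the PR (a simplified sketch, not the real driver.py)."""
    if os.environ.get("TRITON_LIBSYCL_PATH"):           # 1. explicit override
        yield os.environ["TRITON_LIBSYCL_PATH"]
    if shutil.which("ldconfig"):                        # 2. ldconfig cache
        out = subprocess.run(["ldconfig", "-p"],
                             capture_output=True, text=True).stdout
        for line in out.splitlines():
            if "libsycl.so" in line and "=>" in line:
                yield os.path.dirname(line.split("=>")[-1].strip())
    for d in os.environ.get("LD_LIBRARY_PATH", "").split(":"):
        if d:                                           # 3. LD_LIBRARY_PATH
            yield d
    if os.environ.get("ONEAPI_ROOT"):                   # 4. oneAPI install dir
        yield os.path.join(os.environ["ONEAPI_ROOT"],
                           "compiler", "latest", "lib")
    # 5. the intel-sycl-rt pip package location (omitted in this sketch)
    if os.environ.get("CONDA_PREFIX"):                  # 6. conda environment
        yield os.path.join(os.environ["CONDA_PREFIX"], "lib")


def find_libsycl():
    for d in candidate_sycl_dirs():
        path = os.path.join(d, "libsycl.so")
        if os.path.exists(path):
            return path
    return None  # fall back to icpx, which knows the sycl location itself
```

The first candidate that actually contains `libsycl.so` wins, which is why the explicit `TRITON_LIBSYCL_PATH` override is checked before any system-wide lookup.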