CONTRIBUTING.md (1 addition, 1 deletion)
@@ -2,7 +2,7 @@
### License
- <PROJECTNAME> is licensed under the terms in [LICENSE]<linktolicensefileinrepo>. By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.
+ Intel® Extension for OpenXLA is licensed under the terms in [LICENSE](LICENSE.txt). By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.
README.md (20 additions, 30 deletions)
@@ -7,7 +7,7 @@
The [OpenXLA](https://github.com/openxla/xla) Project brings together a community of developers and leading AI/ML teams to accelerate ML and address infrastructure fragmentation across ML frameworks and hardware.

- Intel® Extension for OpenXLA includes PJRT plugin implementation, which seamlessly runs JAX models on Intel GPU. The PJRT API simplified the integration, which allowed the Intel GPU plugin to be developed separately and quickly integrated into JAX. This same PJRT implementation also enables initial Intel GPU support for TensorFlow and PyTorch models with XLA acceleration. Refer to [OpenXLA PJRT Plugin RFC](https://github.com/openxla/community/blob/main/rfcs/20230123-pjrt-plugin.md) for more details.
+ Intel® Extension for OpenXLA includes a PJRT plugin implementation, which seamlessly runs JAX models on Intel GPU. The PJRT API simplifies the integration, enabling the Intel GPU plugin to be developed separately and allowing smooth integration with JAX. This same PJRT implementation also enables initial Intel GPU support for TensorFlow and PyTorch models with XLA acceleration. Refer to [OpenXLA PJRT Plugin RFC](https://github.com/openxla/community/blob/main/rfcs/20230123-pjrt-plugin.md) for more details.

This guide introduces the overview of the OpenXLA high-level integration structure and demonstrates how to build Intel® Extension for OpenXLA and run a JAX example with OpenXLA on Intel GPU. JAX is the first supported front-end.
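To make the PJRT plugin description above concrete, here is a minimal smoke-test sketch, assuming the extension and a matching `jax`/`jaxlib` are already installed and that Intel GPUs are exposed under the `xpu` platform name (the usual convention for this plugin, but verify on your setup):

```bash
# Minimal check that JAX loads the Intel GPU PJRT plugin.
# Assumes jax, jaxlib, and intel-extension-for-openxla are installed in this environment.
python -c "import jax; print(jax.devices())"   # expect xpu devices rather than cpu only
# Run a tiny computation on the default backend to confirm execution works.
python -c "import jax.numpy as jnp; print(jnp.dot(jnp.ones((2, 2)), jnp.ones((2, 2))))"
```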
@@ -26,59 +26,48 @@ This guide introduces the overview of OpenXLA high level integration structure a
Verified Hardware Platforms:

- * Intel® Data Center GPU Max Series, Driver Version: [LTS release 2350.125](https://dgpu-docs.intel.com/releases/LTS-release-notes.html)
+ * Intel® Data Center GPU Max Series, Driver Version: [LTS release 2350.136](https://dgpu-docs.intel.com/releases/LTS-release-notes.html)
- * Intel® Data Center GPU Flex Series, Driver Version: [LTS release 2350.125](https://dgpu-docs.intel.com/driver/installation.html)
+ * Intel® Data Center GPU Flex Series, Driver Version: [LTS release 2350.136](https://dgpu-docs.intel.com/driver/installation.html)

### Software Requirements

* Ubuntu 22.04 (64-bit)
  * Intel® Data Center GPU Flex Series
* Ubuntu 22.04, SUSE Linux Enterprise Server(SLES) 15 SP4
  * Intel® Data Center GPU Max Series
- * [Intel® oneAPI Base Toolkit 2025.0](https://www.intel.com/content/www/us/en/developer/articles/release-notes/intel-oneapi-toolkit-release-notes.html)
- * Jax/Jaxlib 0.4.30
- * Python 3.9-3.12
+ * [Intel® Deep Learning Essentials 2025.1](https://www.intel.com/content/www/us/en/developer/tools/oneapi/base-toolkit-download.html?packages=dl-essentials&dl-lin=offline&dl-essentials-os=linux)
+ * Jax/Jaxlib 0.4.38
+ * Python 3.9-3.13
* pip 19.0 or later (requires manylinux2014 support)

- **NOTE: Since JAX has its own [platform limitation](https://jax.readthedocs.io/en/latest/installation.html#supported-platforms) (Ubuntu 20.04 or later), real software requirements is restricted when works with JAX.**
+ **NOTE: Since JAX has its own [platform limitation](https://jax.readthedocs.io/en/latest/installation.html#supported-platforms) (Ubuntu 20.04 or later), the real software requirements are restricted when working with JAX.**
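Not part of the README itself, but a quick way to check a machine against the version requirements listed above (assuming a standard Python environment with `pip` on the PATH):

```bash
# Compare the local toolchain against the requirements listed above.
python --version        # expect Python 3.9-3.13
pip --version           # expect pip 19.0 or later (manylinux2014 support)
pip show jax jaxlib     # expect 0.4.38 per the requirements above, if already installed
lsb_release -a          # expect Ubuntu 22.04 or SLES 15 SP4
```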
### Install Intel GPU Drivers

|OS|Intel GPU|Install Intel GPU Driver|
|-|-|-|
- |Ubuntu 22.04 |Intel® Data Center GPU Flex Series| Refer to the [Installation Guides](https://dgpu-docs.intel.com/driver/installation.html) for latest driver installation. If install the verified Intel® Data Center GPU Max Series/Intel® Data Center GPU Flex Series [LTS release 2350.125](https://dgpu-docs.intel.com/driver/installation.html), please append the specific version after components, such as `sudo apt-get install intel-opencl-icd==24.45.31740.10-1057~22.04`|
- |Ubuntu 22.04, SLES 15 SP4|Intel® Data Center GPU Max Series| Refer to the [Installation Guides](https://dgpu-docs.intel.com/driver/installation.html) for latest driver installation. If install the verified Intel® Data Center GPU Max Series/Intel® Data Center GPU Flex Series [LTS release 2350.125](https://dgpu-docs.intel.com/releases/LTS-release-notes.html), please append the specific version after components, such as `sudo apt-get install intel-opencl-icd==24.45.31740.10-1057~22.04`|
+ |Ubuntu 22.04 |Intel® Data Center GPU Flex Series| Refer to the [Installation Guides](https://dgpu-docs.intel.com/driver/installation.html) for the latest driver installation. If installing the verified Intel® Data Center GPU Max Series/Intel® Data Center GPU Flex Series [LTS release 2350.136](https://dgpu-docs.intel.com/driver/installation.html), please append the specific version after the component, such as `sudo apt-get install intel-opencl-icd=24.45.31740.10-1057~22.04`|
+ |Ubuntu 22.04, SLES 15 SP4|Intel® Data Center GPU Max Series| Refer to the [Installation Guides](https://dgpu-docs.intel.com/driver/installation.html) for the latest driver installation. If installing the verified Intel® Data Center GPU Max Series/Intel® Data Center GPU Flex Series [LTS release 2350.136](https://dgpu-docs.intel.com/releases/LTS-release-notes.html), please append the specific version after the component, such as `sudo apt-get install intel-opencl-icd=24.45.31740.10-1057~22.04`|
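To make the "append the specific version" instruction in the table concrete, here is a hedged sketch of pinning the driver package with apt; the package and version strings are the ones quoted in the table, and the versions actually offered depend on the Intel repository configured on your system:

```bash
# Show which versions of the driver package the configured repositories provide.
apt-cache madison intel-opencl-icd
# Pin a specific version; apt uses a single "=" between package name and version.
sudo apt-get install intel-opencl-icd=24.45.31740.10-1057~22.04
```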
- ### Install oneAPI Base Toolkit Packages
+ ### Install Intel® Deep Learning Essentials Packages

- Need to install components of Intel® oneAPI Base Toolkit:
+ Need to install components of Intel® Deep Learning Essentials:

**Backup**: Rolling back to **Toolkit 2024.1** is recommended if you hit performance issues. See [Release Notes](https://github.com/intel/intel-extension-for-openxla/releases) for more details.
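After the components are installed, the compiler and runtime environment normally has to be loaded into the current shell before building or running; a sketch assuming the default install prefix `/opt/intel/oneapi`:

```bash
# Load the oneAPI / Deep Learning Essentials environment for this shell.
# The path assumes the default system-wide install location.
source /opt/intel/oneapi/setvars.sh
# Optionally confirm the Intel GPU is visible to the SYCL runtime, if sycl-ls is installed.
sycl-ls
```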
@@ -89,6 +78,7 @@ Please refer to [test/requirements.txt](test/requirements.txt) for the version d
The following table tracks intel-extension-for-openxla versions and compatible versions of `jax` and `jaxlib`. The compatibility between `jax` and `jaxlib` is maintained through JAX. This version restriction will be relaxed over time as the plugin API matures.

By default, bazel will automatically search for the required libraries on your system. This eliminates the need for manual configuration in most cases. For more advanced use cases, you can specify a custom location for the libraries using environment variables:
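The concrete variable names are cut off in this excerpt; purely as an illustration with placeholder names (not the project's documented variables), an override before the build might look like:

```bash
# Placeholder variable names for illustration only -- consult the project's build
# documentation for the environment variables it actually reads.
export ONEAPI_INSTALL_DIR=/opt/intel/oneapi                 # hypothetical: custom oneAPI location
export GCC_TOOLCHAIN_DIR=/usr/lib/gcc/x86_64-linux-gnu/12   # hypothetical: custom GCC toolchain
```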
2. If there is an error `version 'GLIBCXX_3.4.30' not found`, upgrade libstdc++ to the latest, for example for conda:

```bash
- conda install libstdcxx-ng==12.2.0 -c conda-forge
+ conda install libstdcxx-ng -c conda-forge -y
```
3. If there is an error '/usr/bin/ld: cannot find -lstdc++: No such file or directory' during source build under Ubuntu 22.04, check the selected GCC toolchain path and the installed libstdc++.so library path, then create a symbolic link from the selected GCC toolchain path to the libstdc++.so path, for example:
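The example itself is not included in this excerpt; the following is an illustrative sketch with placeholder paths, so verify the actual GCC toolchain directory and libstdc++ location on your system before linking:

```bash
# Find where libstdc++ actually lives and where the selected GCC toolchain looks for it.
find /usr/lib -name "libstdc++.so*"
gcc -print-search-dirs | head -n 3
# Illustrative paths only -- point the toolchain directory at the installed library.
sudo ln -sf /usr/lib/x86_64-linux-gnu/libstdc++.so.6 /usr/lib/gcc/x86_64-linux-gnu/12/libstdc++.so
```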
docs/acc_jax.md (7 additions, 6 deletions)
@@ -1,15 +1,16 @@
# Accelerated JAX on Intel GPU

## Intel® Extension for OpenXLA* plug-in
- Intel® Extension for OpenXLA* includes PJRT plugin implementation, which seamlessly runs JAX models on Intel GPU. The PJRT API simplified the integration, which allowed the Intel GPU plugin to be developed separately and quickly integrated into JAX. Refer to [OpenXLA PJRT Plugin RFC](https://github.com/openxla/community/blob/main/rfcs/20230123-pjrt-plugin.md) for more details.
+ Intel® Extension for OpenXLA includes a PJRT plugin implementation, which seamlessly runs JAX models on Intel GPU. The PJRT API simplifies the integration, enabling the Intel GPU plugin to be developed separately and allowing smooth integration with JAX. This same PJRT implementation also enables initial Intel GPU support for TensorFlow and PyTorch models with XLA acceleration. Refer to [OpenXLA PJRT Plugin RFC](https://github.com/openxla/community/blob/main/rfcs/20230123-pjrt-plugin.md) for more details.

## Requirements
- Please check [README#requirements](../README.md#2-requirements) for the requirements of hardware and software.
+ Please check the [Requirements section](../README.md#2-requirements) for the hardware and software requirements.

## Install
The following table tracks intel-extension-for-openxla versions and compatible versions of `jax` and `jaxlib`. The compatibility between `jax` and `jaxlib` is maintained through JAX. This version restriction will be relaxed over time as the plugin API matures.
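For orientation only, a hedged install sketch; the PyPI package name and the exact `jax`/`jaxlib` pairing for a given plugin release are assumptions to check against the compatibility table referenced above:

```bash
# Install jax/jaxlib at the version listed in the requirements, then the plugin wheel.
# Verify the pairing against the compatibility table before installing.
pip install jax==0.4.38 jaxlib==0.4.38
pip install --upgrade intel-extension-for-openxla
```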