
Commit fc83e00

zufangzhu authored and Lu Teng committed

[Doc] refine readme (#57)

1 parent 55d5858 commit fc83e00

File tree: 1 file changed

README.md: 80 additions & 42 deletions
# Intel® Extension for OpenXLA*

[![Python](https://img.shields.io/pypi/pyversions/intel_extension_for_openxla)](https://badge.fury.io/py/intel-extension-for-openxla)
[![PyPI version](https://badge.fury.io/py/intel-extension-for-openxla.svg)](https://badge.fury.io/py/intel-extension-for-openxla)
[![version](https://img.shields.io/github/v/release/intel/intel-extension-for-openxla?color=brightgreen&include_prereleases)](https://github.com/intel/intel-extension-for-openxla/releases)
[![license](https://img.shields.io/badge/license-Apache%202-blue)](LICENSE.txt)

The [OpenXLA](https://github.com/openxla/xla) Project brings together a community of developers and leading AI/ML teams to accelerate ML and address infrastructure fragmentation across ML frameworks and hardware.
Intel® Extension for OpenXLA includes a PJRT plugin implementation, which seamlessly runs JAX models on Intel GPU.

This guide gives an overview of the OpenXLA high-level integration structure and demonstrates how to build Intel® Extension for OpenXLA and run a JAX example with OpenXLA on Intel GPU. JAX is the first supported front end.

## 1. Overview

<p align="center">
<img src="openxla_for_intel_gpu.jpg" width="50%">
</p>

* [JAX](https://jax.readthedocs.io/en/latest/) provides a familiar NumPy-style API, includes composable function transformations for compilation, batching, automatic differentiation, and parallelization, and runs the same code on multiple backends.
* TensorFlow and PyTorch support is on the way.
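The composable transformations mentioned above can be sketched in a few lines (a minimal illustration, not code from this repo; it runs on whatever backend JAX has available):

```python
import jax
import jax.numpy as jnp

# A toy scalar loss over a weight vector.
def loss(w):
    return jnp.sum(jnp.tanh(w) ** 2)

# grad() derives the gradient, jit() compiles it with XLA, and vmap()
# vectorizes it over a leading batch dimension -- the same code runs
# unchanged on CPU, GPU, or any other supported backend.
batched_grad = jax.vmap(jax.jit(jax.grad(loss)))

print(batched_grad(jnp.ones((4, 3))).shape)  # (4, 3)
```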

## 2. Requirements

### Hardware Requirements

Verified Hardware Platforms:

* Intel® Data Center GPU Max Series, Driver Version: [682](https://dgpu-docs.intel.com/releases/production_682.14_20230804.html)
* Intel® Data Center GPU Flex Series 170, Driver Version: [682](https://dgpu-docs.intel.com/releases/production_682.14_20230804.html)

### Software Requirements

* Ubuntu 22.04, Red Hat 8.6/8.8/9.2 (64-bit)
  * Intel® Data Center GPU Flex Series
* Ubuntu 22.04, Red Hat 8.6/8.8/9.2 (64-bit), SUSE Linux Enterprise Server (SLES) 15 SP4
  * Intel® Data Center GPU Max Series
* Intel® oneAPI Base Toolkit 2023.1
* JAX/jaxlib 0.4.13
* Python 3.9-3.11
* pip 19.0 or later (requires manylinux2014 support)

### Install Intel GPU Drivers

|OS|Intel GPU|Install Intel GPU Driver|
|-|-|-|
|Ubuntu 22.04, Red Hat 8.6/8.8/9.2|Intel® Data Center GPU Flex Series|Refer to the [Installation Guides](https://dgpu-docs.intel.com/installation-guides/index.html#intel-data-center-gpu-flex-series) for the latest driver installation. To install the verified driver version [682](https://dgpu-docs.intel.com/releases/production_682.14_20230804.html), append the specific version after each component, e.g. `sudo apt-get install intel-opencl-icd=23.22.26516.25-682~22.04`|
|Ubuntu 22.04, Red Hat 8.6/8.8/9.2, SLES 15 SP4|Intel® Data Center GPU Max Series|Refer to the [Installation Guides](https://dgpu-docs.intel.com/installation-guides/index.html#intel-data-center-gpu-max-series) for the latest driver installation. To install the verified driver version [682](https://dgpu-docs.intel.com/releases/production_682.14_20230804.html), append the specific version after each component, e.g. `sudo apt-get install intel-opencl-icd=23.22.26516.25-682~22.04`|

### Install oneAPI Base Toolkit Packages

The following components of Intel® oneAPI Base Toolkit are required:

* Intel® oneAPI DPC++ Compiler
* Intel® oneAPI Math Kernel Library (oneMKL)
* Intel® oneAPI Threading Building Blocks (TBB), a dependency of the DPC++ Compiler

```bash
wget https://registrationcenter-download.intel.com/akdlm/IRC_NAS/7deeaac4-f605-4bcf-a81b-ea7531577c61/l_BaseKit_p_2023.1.0.46401_offline.sh
sudo sh ./l_BaseKit_p_2023.1.0.46401_offline.sh

# Source oneAPI env
source /opt/intel/oneapi/compiler/2023.1.0/env/vars.sh
source /opt/intel/oneapi/mkl/2023.1.0/env/vars.sh
source /opt/intel/oneapi/tbb/2021.9.0/env/vars.sh
```
## 3. Install

### Install via PyPI wheel

```bash
pip install --upgrade intel-extension-for-openxla
```
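A quick way to sanity-check an installation (a sketch; seeing `xpu` devices assumes the plugin registered successfully and an Intel GPU is present) is to list the devices JAX can see:

```python
import jax

# With intel-extension-for-openxla installed, Intel GPUs show up as
# `xpu` devices; without it, JAX falls back to CPU.
print(jax.local_devices())
```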
### Install from Source Build

```bash
git clone https://github.com/intel/intel-extension-for-openxla.git
cd intel-extension-for-openxla
pip install jax==0.4.13 jaxlib==0.4.13
./configure  # Choose Yes for all.
bazel build //xla/tools/pip_package:build_pip_package
./bazel-bin/xla/tools/pip_package/build_pip_package ./
pip install intel_extension_for_openxla-0.1.0-cp39-cp39-linux_x86_64.whl
```

**Additional Build Options**:

This repo pulls public XLA code as its third-party build dependency. As an OpenXLA developer, you may need to modify and override this pinned XLA repo with a local checkout, using the following command:

```bash
bazel build --override_repository=xla=/path/to/xla //xla/tools/pip_package:build_pip_package
```

Besides the Python wheel, you can also build the plugin as a standalone shared library with `bazel build //xla:pjrt_plugin_xpu.so` and load it via the environment variable `PJRT_NAMES_AND_LIBRARY_PATHS='xpu:Your_openxla_path/bazel-bin/xla/pjrt_plugin_xpu.so'`.

## 4. Run JAX Example

### Run the below JAX Python code

When running JAX code, `jax.local_devices()` can be used to check which device is running.

```python
import jax
import jax.numpy as jnp

def lax_conv():
    ...  # definition not shown in this diff

print(lax_conv())
```
### Reference result

```
jax.local_devices(): [xpu(id=0), xpu(id=1)]
[[[[2.0449753 2.093208  2.1844783 1.9769732 1.5857391 1.6942389]
   [1.9218378 2.2862523 2.1549542 1.8367321 1.3978379 1.3860377]
   ...
```
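The body of `lax_conv` is elided above; a self-contained convolution in the same spirit (a sketch with illustrative shapes, not the repo's exact example) looks like:

```python
import jax
from jax import lax, random

def lax_conv():
    key = random.PRNGKey(0)
    lhs = random.uniform(key, (1, 1, 9, 9))  # input: batch, channels, H, W
    rhs = random.uniform(key, (1, 1, 4, 4))  # kernel: out_ch, in_ch, kH, kW
    # NCHW 2D convolution with unit stride and no padding.
    return lax.conv(lhs, rhs, window_strides=(1, 1), padding="VALID")

print(jax.local_devices())
print(lax_conv().shape)  # (1, 1, 6, 6)
```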
## 5. FAQ

1. If there is an error 'No visible XPU devices', print `jax.local_devices()` to check which device is running, and set `export OCL_ICD_ENABLE_TRACE=1` to check for driver error messages. The following code enables more debug logging for a JAX app:

    ```python
    import logging
    logging.basicConfig(level=logging.DEBUG)
    ```

2. If there is an error 'version GLIBCXX_3.4.30 not found', upgrade libstdc++ to the latest version, for example with conda:

    ```bash
    conda install libstdcxx-ng==12.2.0 -c conda-forge
    ```
