Commit 82d5458

[doc]: Update installation doc and readme (#1465)

1 parent: df306f6

File tree

2 files changed: +4 −98 lines

README.md

Lines changed: 1 addition & 15 deletions
````diff
@@ -45,21 +45,7 @@ Using our PyTorch API is the easiest way to get started:
 
 ### Install from PIP
 
-We provide prebuilt python wheels for Linux. Install FlashInfer with the following command:
-
-```bash
-# For CUDA 12.6 & torch 2.6
-pip install flashinfer-python -i https://flashinfer.ai/whl/cu126/torch2.6
-# For other CUDA & torch versions, check https://docs.flashinfer.ai/installation.html
-```
-
-To try the latest features from the main branch, use our nightly-built wheels:
-
-```bash
-pip install flashinfer-python -i https://flashinfer.ai/whl/nightly/cu126/torch2.6
-```
-
-For a JIT version (compiling every kernel from scratch, [NVCC](https://developer.nvidia.com/cuda-downloads) is required), install from [PyPI](https://pypi.org/project/flashinfer-python/):
+FlashInfer is available as a Python package for Linux on PyPI. You can install it with the following command:
 
 ```bash
 pip install flashinfer-python
````
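The deleted README text told readers to pick a wheel index matching their CUDA and torch versions, which in turn requires knowing which CUDA build their PyTorch ships with. A minimal sketch of that check (a hypothetical helper, not part of FlashInfer, guarded so it also runs when PyTorch is absent):

```python
# Sketch of the "check your PyTorch CUDA version" step from the old docs,
# guarded so it degrades gracefully when PyTorch is not installed.
def pytorch_cuda_version():
    """Return the CUDA version PyTorch was built with, or None if unavailable."""
    try:
        import torch
    except ImportError:
        return None  # PyTorch is not installed
    return torch.version.cuda  # None for CPU-only builds

print(pytorch_cuda_version())
```

The one-liner equivalent from the old docs was `python -c "import torch; print(torch.version.cuda)"`.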

docs/installation.rst

Lines changed: 3 additions & 83 deletions
````diff
@@ -15,94 +15,14 @@ Prerequisites
 
 - Python: 3.8, 3.9, 3.10, 3.11, 3.12
 
-- PyTorch: 2.2/2.3/2.4/2.5 with CUDA 11.8/12.1/12.4 (only for torch 2.4 or later)
-
-  - Use ``python -c "import torch; print(torch.version.cuda)"`` to check your PyTorch CUDA version.
-
-- Supported GPU architectures: ``sm75``, ``sm80``, ``sm86``, ``sm89``, ``sm90``.
-
 Quick Start
 ^^^^^^^^^^^
 
-The easiest way to install FlashInfer is via pip, we host wheels with indexed URL for different PyTorch versions and CUDA versions. Please note that the package currently used by FlashInfer is named ``flashinfer-python``, not ``flashinfer``.
-
-.. tabs::
-
-    .. tab:: PyTorch 2.6
-
-        .. tabs::
-
-            .. tab:: CUDA 12.6
-
-                .. code-block:: bash
-
-                    pip install flashinfer-python -i https://flashinfer.ai/whl/cu126/torch2.6/
-
-            .. tab:: CUDA 12.4
-
-                .. code-block:: bash
-
-                    pip install flashinfer-python -i https://flashinfer.ai/whl/cu124/torch2.6/
-
-    .. tab:: PyTorch 2.5
-
-        .. tabs::
-
-            .. tab:: CUDA 12.4
-
-                .. code-block:: bash
-
-                    pip install flashinfer-python -i https://flashinfer.ai/whl/cu124/torch2.5/
-
-            .. tab:: CUDA 12.1
-
-                .. code-block:: bash
-
-                    pip install flashinfer-python -i https://flashinfer.ai/whl/cu121/torch2.5/
-
-            .. tab:: CUDA 11.8
-
-                .. code-block:: bash
-
-                    pip install flashinfer-python -i https://flashinfer.ai/whl/cu118/torch2.5/
-
-    .. tab:: PyTorch 2.4
-
-        .. tabs::
-
-            .. tab:: CUDA 12.4
-
-                .. code-block:: bash
-
-                    pip install flashinfer-python -i https://flashinfer.ai/whl/cu124/torch2.4/
-
-            .. tab:: CUDA 12.1
-
-                .. code-block:: bash
-
-                    pip install flashinfer-python -i https://flashinfer.ai/whl/cu121/torch2.4/
-
-            .. tab:: CUDA 11.8
-
-                .. code-block:: bash
-
-                    pip install flashinfer-python -i https://flashinfer.ai/whl/cu118/torch2.4/
-
-    .. tab:: PyTorch 2.3
-
-        .. tabs::
-
-            .. tab:: CUDA 12.1
-
-                .. code-block:: bash
-
-                    pip install flashinfer-python -i https://flashinfer.ai/whl/cu121/torch2.3/
-
-            .. tab:: CUDA 11.8
-
-                .. code-block:: bash
-
-                    pip install flashinfer-python -i https://flashinfer.ai/whl/cu118/torch2.3/
+The easiest way to install FlashInfer is via pip. Please note that the package currently used by FlashInfer is named ``flashinfer-python``, not ``flashinfer``.
 
+.. code-block:: bash
 
+   pip install flashinfer-python
 
 
 .. _install-from-source:
````
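The wheel index URLs removed above all follow one pattern: ``https://flashinfer.ai/whl/cu<CUDA>/torch<TORCH>/``. A minimal sketch of that pattern (a hypothetical helper for illustration, not part of FlashInfer or its docs):

```python
# Hypothetical helper reproducing the legacy wheel-index URL pattern that the
# removed docs enumerated by hand, e.g. https://flashinfer.ai/whl/cu126/torch2.6/
def wheel_index_url(cuda_version: str, torch_version: str) -> str:
    """Build the legacy FlashInfer wheel index URL for a CUDA/torch pair."""
    cuda_tag = "cu" + cuda_version.replace(".", "")  # "12.6" -> "cu126"
    return f"https://flashinfer.ai/whl/{cuda_tag}/torch{torch_version}/"

print(wheel_index_url("12.6", "2.6"))
# -> https://flashinfer.ai/whl/cu126/torch2.6/
```

With the PyPI package that this commit documents, none of this is needed: ``pip install flashinfer-python`` suffices.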
