Commit 096827c

[Docs] Add notes on ROCm-supported models (#2087)
1 parent 6565d9e


docs/source/models/supported_models.rst

Lines changed: 10 additions & 3 deletions
@@ -73,6 +73,10 @@ If your model uses one of the above model architectures, you can seamlessly run
 Otherwise, please refer to :ref:`Adding a New Model <adding_a_new_model>` for instructions on how to implement support for your model.
 Alternatively, you can raise an issue on our `GitHub <https://github.com/vllm-project/vllm/issues>`_ project.
 
+.. note::
+    Currently, the ROCm version of vLLM does not support Mixtral.
+    Additionally, it only supports Mistral for context lengths up to 4096.
+
 .. tip::
     The easiest way to check if your model is supported is to run the program below:
 
@@ -84,18 +88,21 @@ Alternatively, you can raise an issue on our `GitHub <https://github.com/vllm-pr
     output = llm.generate("Hello, my name is")
     print(output)
 
-To use model from www.modelscope.cn
+If vLLM successfully generates text, it indicates that your model is supported.
+
+.. tip::
+    To use models from `ModelScope <www.modelscope.cn>`_ instead of HuggingFace Hub, set an environment variable:
 
 .. code-block:: shell
 
     $ export VLLM_USE_MODELSCOPE=True
 
+And use with :code:`trust_remote_code=True`.
+
 .. code-block:: python
 
     from vllm import LLM
 
     llm = LLM(model=..., revision=..., trust_remote_code=True) # Name or path of your model
     output = llm.generate("Hello, my name is")
     print(output)
-
-If vLLM successfully generates text, it indicates that your model is supported.
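
For reference, the support check that the edited snippet describes assembles into a self-contained script along the following lines. This is a minimal sketch: the model ID :code:`facebook/opt-125m` is an illustrative placeholder, not something named by this commit.

.. code-block:: python

    from vllm import LLM

    # Name or path of the model to check; the ID below is a placeholder.
    llm = LLM(model="facebook/opt-125m")

    # If vLLM supports the architecture, this generates text without errors.
    output = llm.generate("Hello, my name is")
    print(output)

The ModelScope pieces combine the same way. Assuming :code:`VLLM_USE_MODELSCOPE` is read when vLLM resolves the model, the variable can also be set from Python before the import instead of being exported in the shell; the model ID below is again a hypothetical placeholder.

.. code-block:: python

    import os

    # Equivalent to `export VLLM_USE_MODELSCOPE=True` in the shell; set it
    # before importing vllm so the value is visible when weights are resolved.
    os.environ["VLLM_USE_MODELSCOPE"] = "True"

    from vllm import LLM

    # Hypothetical ModelScope model ID; trust_remote_code=True lets vLLM run
    # the model code shipped with the repository, as the doc change notes.
    llm = LLM(model="qwen/Qwen-7B", trust_remote_code=True)
    print(llm.generate("Hello, my name is"))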
