Commit dbf1348

[Misc] Fix doctest (#7648)
### What this PR does / why we need it?

This patch fixes the doctest [failure](https://github.com/vllm-project/vllm-ascend/actions/runs/23501203166/job/68396531067):

1. Pin torchaudio and torchvision to the versions that match torch_npu (refer to https://github.com/pytorch/pytorch/wiki/PyTorch-Versions).
2. Apply some convenience fixes to prevent network fluctuations from causing test failures (e.g. offline mode when using ModelScope).

### Does this PR introduce _any_ user-facing change?

### How was this patch tested?

- vLLM version: v0.18.0
- vLLM main: vllm-project/vllm@35141a7

---------

Signed-off-by: wangli <wangli858794774@gmail.com>
1 parent 5a6a4d1 commit dbf1348

File tree

3 files changed: +8 −4 lines changed

requirements.txt

Lines changed: 2 additions & 2 deletions
```diff
@@ -14,8 +14,8 @@ psutil
 setuptools>=64
 setuptools-scm>=8
 torch==2.9.0
-torchvision
-torchaudio
+torchvision==0.24.0
+torchaudio==2.9.0
 wheel
 xgrammar>=0.1.30
 pandas-stubs
```
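The pins above follow the torch ↔ torchvision/torchaudio pairing table linked in the commit message. A minimal Python sketch of that pairing check (`COMPAT` and `matching_pins` are hypothetical illustrations, not code from the repo; the table entries are an assumption transcribed for recent releases only):

```python
# Hypothetical helper: look up the torchvision/torchaudio versions that pair
# with a given torch release, per the table at
# https://github.com/pytorch/pytorch/wiki/PyTorch-Versions.
# COMPAT is an assumption covering only the two most recent releases.
COMPAT = {
    "2.9.0": {"torchvision": "0.24.0", "torchaudio": "2.9.0"},
    "2.8.0": {"torchvision": "0.23.0", "torchaudio": "2.8.0"},
}

def matching_pins(torch_version: str) -> dict:
    """Return the companion-package pins for a torch release, or raise."""
    try:
        return COMPAT[torch_version]
    except KeyError:
        raise ValueError(f"no known pairing for torch=={torch_version}")

print(matching_pins("2.9.0"))
# → {'torchvision': '0.24.0', 'torchaudio': '2.9.0'}
```

Installing `torchvision`/`torchaudio` unpinned lets pip resolve a newer pair that drags in a different torch, which is what the explicit pins in requirements.txt prevent.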

tests/e2e/doctests/002-pip-binary-installation-test.sh

Lines changed: 5 additions & 2 deletions
```diff
@@ -36,21 +36,24 @@ function config_pip_mirror() {
 function install_binary_test() {

     install_system_packages
-    config_pip_mirror
     create_vllm_venv
+    config_pip_mirror

     PIP_VLLM_VERSION=$(get_version pip_vllm_version)
     VLLM_VERSION=$(get_version vllm_version)
     PIP_VLLM_ASCEND_VERSION=$(get_version pip_vllm_ascend_version)
     _info "====> Install vllm==${PIP_VLLM_VERSION} and vllm-ascend ${PIP_VLLM_ASCEND_VERSION}"

     # Setup extra-index-url for x86 & torch_npu dev version
-    pip config set global.extra-index-url "https://download.pytorch.org/whl/cpu/ https://mirrors.huaweicloud.com/ascend/repos/pypi"
+    pip config set global.extra-index-url "https://download.pytorch.org/whl/cpu/"

     # The vLLM version already in pypi, we install from pypi.
     pip install vllm=="${PIP_VLLM_VERSION}"

     pip install vllm-ascend=="${PIP_VLLM_ASCEND_VERSION}"
+    if [ "${PIP_VLLM_ASCEND_VERSION}" == "0.17.0rc1" ]; then
+        pip install torchvision==0.24.0 torchaudio==2.9.0
+    fi

     pip list | grep vllm
```
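The version guard added to `install_binary_test` applies the pins only to the one release that needs them. A minimal standalone sketch of that guard (`PIP_VLLM_ASCEND_VERSION` is hard-coded here for illustration; in the real script it comes from `get_version`, and the `echo` stands in for the actual `pip install`):

```shell
# Sketch of the conditional pin from the diff above. Assumption: only the
# 0.17.0rc1 release requires the explicit torchvision/torchaudio pins.
PIP_VLLM_ASCEND_VERSION="0.17.0rc1"   # real script: $(get_version pip_vllm_ascend_version)
if [ "${PIP_VLLM_ASCEND_VERSION}" = "0.17.0rc1" ]; then
    # echo stands in for: pip install torchvision==0.24.0 torchaudio==2.9.0
    echo "pip install torchvision==0.24.0 torchaudio==2.9.0"
fi
```

Keeping the guard release-specific means future vllm-ascend versions, whose dependency metadata presumably carries correct pins, are unaffected.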

tests/e2e/run_doctests.sh

Lines changed: 1 addition & 0 deletions
```diff
@@ -23,6 +23,7 @@ set -eo errexit

 export VLLM_USE_MODELSCOPE=true
 export MODELSCOPE_HUB_FILE_LOCK=false
+export HF_HUB_OFFLINE=1

 _info "====> Start Quickstart test"
 . "${SCRIPT_DIR}/doctests/001-quickstart-test.sh"
```
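With models already fetched through ModelScope, the new export makes Hugging Face Hub lookups fail fast instead of retrying over a flaky network. A sketch of the resulting doctest environment (export values are taken from the diff; the explanatory comments are my reading of their intent, not text from the repo):

```shell
# Offline/mirror environment used by the doctests.
export VLLM_USE_MODELSCOPE=true        # resolve models via ModelScope
export MODELSCOPE_HUB_FILE_LOCK=false  # assumption: avoid file-lock stalls in CI
export HF_HUB_OFFLINE=1                # forbid Hugging Face Hub network calls

echo "HF_HUB_OFFLINE=${HF_HUB_OFFLINE}"
```

`HF_HUB_OFFLINE=1` is the documented huggingface_hub switch for offline mode: cached files are used and any network fetch raises immediately, which is exactly the deterministic failure mode a CI doctest wants.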

0 commit comments
