
Commit 3c6b65b

fix: Remove vllm image cache
--bug=1052365 --user=刘瑞斌 [github#2353] vllm vision model: changing max tokens does not take effect https://www.tapd.cn/57709429/s/1657667
1 parent fa1886a commit 3c6b65b

File tree

1 file changed: +3 −0 lines changed

  • apps/setting/models_provider/impl/vllm_model_provider/model/image.py

apps/setting/models_provider/impl/vllm_model_provider/model/image.py

Lines changed: 3 additions & 0 deletions
@@ -18,3 +18,6 @@ def new_instance(model_type, model_name, model_credential: Dict[str, object], **
             stream_usage=True,
             **optional_params,
         )
+
+    def is_cache_model(self):
+        return False
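
Why this fixes the bug: when a model implementation reports is_cache_model() as True, the provider layer can keep handing back a previously built model instance, so an updated max_tokens setting never reaches the cached vision model. Overriding is_cache_model() to return False forces a fresh instance to be built with the current settings on each lookup. Below is a minimal sketch of that assumed caching pattern; get_model, _model_cache, and the cache key are hypothetical stand-ins for illustration, not MaxKB's actual API.

    # Minimal sketch of the assumed caching pattern (hypothetical names,
    # not MaxKB's actual API). An instance is reused only if the model
    # opted into caching via is_cache_model().

    _model_cache = {}

    def get_model(provider, model_type, model_name, credential, **kwargs):
        key = (id(provider), model_type, model_name)
        cached = _model_cache.get(key)
        if cached is not None:
            # Before this commit the vllm image model was cached, so a
            # stale instance (with the old max_tokens) was returned here.
            return cached
        instance = provider.new_instance(model_type, model_name, credential, **kwargs)
        if instance.is_cache_model():
            # With is_cache_model() returning False (this commit), the
            # instance is never stored, so every call rebuilds it with
            # the latest credential and parameter settings.
            _model_cache[key] = instance
        return instance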
