🔎 Search before asking
- I have searched the PaddleOCR Docs and found no similar bug report.
- I have searched the PaddleOCR Issues and found no similar bug report.
- I have searched the PaddleOCR Discussions and found no similar bug report.
🐛 Bug (Bug description)
The official docs state that llama-cpp-server is supported, but calling it from Python raises: Invalid backend for the VL recognition module: llama-cpp-server. Supported values are ['native', 'vllm-server', 'sglang-server', 'fastdeploy-server', 'mlx-vlm-server'].
🏃‍♂️ Environment
Name: paddleocr
Version: 3.4.0
🌰 Minimal Reproducible Example
```python
from paddleocr import PaddleOCRVL

pipeline = PaddleOCRVL(
    vl_rec_backend="llama-cpp-server",
    vl_rec_server_url="http://127.0.0.1:8111/v1",
    vl_rec_api_model_name="PaddlePaddle/PaddleOCR-VL-1.5-GGUF",
)
```