
Conversation

axel7083 (Contributor) commented on Apr 18, 2025

What does this PR do?

Screenshot / video of UI

What issues does this PR fix or reference?

Fixes #2886

How to test this PR?

Add the following model entry to the catalog:

    {
      "id": "Qwen/Qwen2-VL-2B-Instruct",
      "name": "Qwen/Qwen2-VL-2B-Instruct",
      "description": "Qwen/Qwen2-VL-2B-Instruct",
      "url": "huggingface:/Qwen/Qwen2-VL-2B-Instruct",
      "backend": "vllm"
    }
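
For context, here is a sketch of how that entry might sit in a catalog file. The surrounding "models" array is an assumption about the catalog schema and is not spelled out in this PR:

    {
      "models": [
        {
          "id": "Qwen/Qwen2-VL-2B-Instruct",
          "name": "Qwen/Qwen2-VL-2B-Instruct",
          "description": "Qwen/Qwen2-VL-2B-Instruct",
          "url": "huggingface:/Qwen/Qwen2-VL-2B-Instruct",
          "backend": "vllm"
        }
      ]
    }

Once the model shows up in the catalog, downloading it and creating an inference server for it should exercise the new vllm inference provider.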

@jeffmaury (Collaborator)

axel7083 and others added 2 commits on May 15, 2025 at 14:38
jeffmaury force-pushed the feature/inference/vllm branch from 1223bd9 to f101d13 on May 15, 2025 at 14:26
Linked issue: Inference provider for vllm