vLLM OpenAI-Compatible for RunPod

This repository lets you create a serverless endpoint on RunPod with vLLM serving an OpenAI-compatible API.

How to use

  1. Clone this repo on RunPod (GitHub Repo).
  2. Set these env vars:
    • MODEL_NAME=TheBloke/Meditron-70B-AWQ
    • HUGGING_FACE_HUB_TOKEN=hf_XXXXXXXXXXXXXX
  3. Use a GPU with 48 GB or 80 GB of VRAM.
  4. Call the /v1/chat/completions endpoint (see the example below).
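
Once the endpoint is up, any OpenAI-compatible client can talk to it. The sketch below uses the `openai` Python package; the base URL and API key are placeholders, assuming you substitute your own RunPod endpoint URL and RunPod API key.

```python
# Minimal sketch: querying the OpenAI-compatible endpoint with the openai client.
# The base_url and api_key values are placeholders for your RunPod deployment.
from openai import OpenAI

client = OpenAI(
    base_url="https://<YOUR_RUNPOD_ENDPOINT>/v1",  # placeholder: your endpoint's base URL
    api_key="<YOUR_RUNPOD_API_KEY>",               # placeholder: your RunPod API key
)

response = client.chat.completions.create(
    model="TheBloke/Meditron-70B-AWQ",  # should match the MODEL_NAME env var set above
    messages=[
        {"role": "user", "content": "Hello! What can you do?"},
    ],
    max_tokens=256,
)

print(response.choices[0].message.content)
```

Any other OpenAI-compatible tooling (curl, LangChain, etc.) works the same way, since the endpoint follows the standard /v1/chat/completions request and response format.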
