testinprod-io/panda-vllm-proxy

VLLM Proxy

A proxy server for vLLM, run with Uvicorn.

Development Setup

1. Create and activate virtual environment

# Create virtual environment using specific Python version
poetry env use 3.11.12

# Activate the virtual environment
# (assumes Poetry creates it in-project at .venv; otherwise run
#  `source "$(poetry env info --path)/bin/activate"` instead)
source .venv/bin/activate

2. Install dependencies

# Install project dependencies (including dev dependencies)
poetry install

3. Run for local development

# Run local mock vllm
cd docker/local
docker compose -f docker-compose.local.yml up -d

# Run vllm-proxy server locally
cd ../..
uvicorn src.app.main:app --host 0.0.0.0 --reload
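Once Uvicorn is up, you can smoke-test the proxy from another terminal. This is a sketch: the `/v1/models` route is an assumption borrowed from vLLM's OpenAI-compatible API, and port 8000 is Uvicorn's default; adjust both if this proxy is configured differently.

```shell
# Smoke-test the locally running proxy (Uvicorn's default port is 8000).
# The /v1/models route is an assumption based on vLLM's OpenAI-compatible
# API; substitute the proxy's actual route if it differs.
curl -s http://localhost:8000/v1/models || echo "proxy not reachable"
```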

Tests

# Run all tests
pytest tests

# Run a specific test file
pytest tests/app/test_openai.py
