Added optional LoRA adapter support for vLLM inference. #88

Triggered via pull request: January 8, 2026, 18:15
Status: Success
Total duration: 3m 1s
Artifacts: 1
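
The run title above refers to optional LoRA adapter support for vLLM inference. For context only, the sketch below shows the standard vLLM pattern such a feature typically builds on: enabling LoRA on the engine and passing a LoRARequest per generate call. The model name, adapter name, ID, and path are illustrative placeholders, not values taken from PR #88.

```python
# Minimal sketch of vLLM inference with an optional LoRA adapter.
# Model name and adapter path are illustrative placeholders.
from vllm import LLM, SamplingParams
from vllm.lora.request import LoRARequest

# Enable LoRA support on the engine; without enable_lora=True,
# passing a lora_request to generate() is rejected.
llm = LLM(model="meta-llama/Llama-2-7b-hf", enable_lora=True)

sampling_params = SamplingParams(temperature=0.0, max_tokens=64)

# The adapter is optional: pass lora_request to apply it,
# or omit the kwarg to run the base model unchanged.
lora = LoRARequest("example_adapter", 1, "/path/to/lora/adapter")

outputs = llm.generate(
    ["Summarize the benefits of LoRA fine-tuning."],
    sampling_params,
    lora_request=lora,  # drop this kwarg for base-model inference
)

for output in outputs:
    print(output.outputs[0].text)
```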

docs.yml

on: pull_request

Artifacts

Produced during runtime
Name       Size    Digest
docs-html  250 KB  sha256:2f4945f342fe23dfc3ea81bff0eec0010be21ac5713737b8b6597c96e517121d