
Commit 9020421

Merge branch 'main' into feat/optional_deps
2 parents ca77be5 + f7731ec commit 9020421

File tree

3 files changed: +20 -2 lines changed


docs/backends.md

Lines changed: 18 additions & 0 deletions
@@ -40,6 +40,24 @@ docker run --gpus 1 -ti --shm-size 1g --ipc=host --rm -p 8080:80 \

 For more information on starting a TGI server, see the [TGI Documentation](https://huggingface.co/docs/text-generation-inference/index).

+### 3. llama.cpp
+
+[llama.cpp](https://github.com/ggml-org/llama.cpp) provides a lightweight, OpenAI-compatible server through its [llama-server](https://github.com/ggml-org/llama.cpp/blob/master/tools/server) tool.
+
+To start a llama.cpp server with the gpt-oss-20b model, you can use the following command:
+
+```bash
+llama-server -hf ggml-org/gpt-oss-20b-GGUF --alias gpt-oss-20b --ctx-size 0 --jinja -ub 2048 -b 2048
+```
+
+Note that we provide the alias `gpt-oss-20b` for the model name because `guidellm` uses it to retrieve model metadata in JSON format, and that metadata is not included in GGUF model repositories. A simple workaround is to download the metadata files from the safetensors repository and place them in a local directory named after the alias:
+
+```bash
+huggingface-cli download openai/gpt-oss-20b --include "*.json" --local-dir gpt-oss-20b/
+```
+
+Now you can run `guidellm` as usual, and it will fetch the model metadata from the local directory.
+
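The alias-to-metadata lookup described above can be sketched in Python. `load_model_metadata` is a hypothetical helper, not guidellm's actual API; it only illustrates how a local directory named after the alias can stand in for the metadata missing from a GGUF repository:

```python
import json
from pathlib import Path


def load_model_metadata(alias: str, search_dir: Path) -> dict:
    """Resolve a model alias to locally stored JSON metadata.

    Hypothetical helper (not guidellm's real API): when the server only
    reports an alias such as "gpt-oss-20b", look for a local directory of
    that name holding the JSON files downloaded from the safetensors repo.
    """
    config_path = search_dir / alias / "config.json"
    if not config_path.is_file():
        raise FileNotFoundError(
            f"No metadata for alias {alias!r}; expected {config_path}"
        )
    return json.loads(config_path.read_text())
```

After the `huggingface-cli download` command above, a lookup like `load_model_metadata("gpt-oss-20b", Path("."))` would return the contents of `gpt-oss-20b/config.json`.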
## Expanding Backend Support
GuideLLM is an open platform, and we encourage contributions to extend its backend support. Whether it's adding new server implementations, integrating with Python-based backends, or enhancing existing capabilities, your contributions are welcome. For more details on how to contribute, see the [CONTRIBUTING.md](https://github.com/vllm-project/guidellm/blob/main/CONTRIBUTING.md) file.

src/guidellm/backend/openai.py

Lines changed: 1 addition & 1 deletion
@@ -688,7 +688,7 @@ def _extract_completions_delta_content(
         return data["choices"][0]["text"]

     if type_ == "chat_completions":
-        return data["choices"][0]["delta"]["content"]
+        return data.get("choices", [{}])[0].get("delta", {}).get("content")

     raise ValueError(f"Unsupported type: {type_}")
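The switch from chained indexing to chained `.get()` calls matters for streamed chat responses, where some chunks carry a `delta` without a `content` key. A minimal standalone sketch of the patched logic (not the actual method):

```python
def extract_chat_delta_content(data: dict):
    # Same defensive pattern as the patched line: a missing "choices",
    # "delta", or "content" key yields None instead of raising KeyError.
    return data.get("choices", [{}])[0].get("delta", {}).get("content")


# OpenAI-style streaming chunks: the first chunk often carries only the
# role, and the final chunk only a finish_reason; neither has "content".
first = {"choices": [{"delta": {"role": "assistant"}}]}
middle = {"choices": [{"delta": {"content": "Hello"}}]}
last = {"choices": [{"delta": {}, "finish_reason": "stop"}]}
```

Note that the `[{}]` default only guards against a missing `"choices"` key; an explicitly empty `choices` list would still raise `IndexError`.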

src/guidellm/presentation/data_models.py

Lines changed: 1 addition & 1 deletion
@@ -190,7 +190,7 @@ class TabularDistributionSummary(DistributionSummary):
     """

     @computed_field
-    def percentile_rows(self) -> list[dict[str, float]]:
+    def percentile_rows(self) -> list[dict[str, Union[str, float]]]:
         rows = [
             {"percentile": name, "value": value}
             for name, value in self.percentiles.model_dump().items()
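The widened return annotation matches what the rows actually contain: each dict maps `"percentile"` to a str name and `"value"` to a float. A standalone sketch of the same construction (plain function, no pydantic; the input dict stands in for `self.percentiles.model_dump()`):

```python
from typing import Union


def percentile_rows(
    percentiles: dict[str, float],
) -> list[dict[str, Union[str, float]]]:
    # Each row mixes a str (the percentile name, e.g. "p50") with a float
    # (its value), so the value type must be Union[str, float], not float.
    return [
        {"percentile": name, "value": value}
        for name, value in percentiles.items()
    ]


rows = percentile_rows({"p50": 12.5, "p95": 48.0})
```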
