
Commit 8b994fb

remove link to standalone llms.py
1 parent 1097af9 commit 8b994fb


2 files changed: +1 −34 lines changed


MyApp/_posts/2025-09-25_llms-py.md

Lines changed: 1 addition & 23 deletions
@@ -35,34 +35,12 @@ preferred priority - whether optimizing for cost, performance or availability.
 - **No Setup**: Just download and use, configure preferred LLMs in [llms.json](https://github.com/ServiceStack/llms/blob/main/llms.json)
 
 
-## 📦 Installation Options
-
-#### Option 1: PyPI Package
+## 📦 Install
 
 :::sh
 pip install llms-py
 :::
 
-#### Option 2: Direct Download
-
-For standalone use, download [llms.py](https://github.com/ServiceStack/llms/blob/main/llms.py) and make it executable:
-
-```bash
-curl -O https://raw.githubusercontent.com/ServiceStack/llms/main/llms.py
-chmod +x llms.py
-mv llms.py ~/.local/bin/llms
-```
-
-Then install its only dependency:
-
-:::sh
-pip install aiohttp
-:::
-
-#### Ideal for usage with ComfyUI Custom Nodes
-
-Simply drop [llms.py](https://github.com/ServiceStack/llms/blob/main/llms.py) into your ComfyUI custom nodes directory - no additional dependencies required!
-
 ## 🔧 Quick Start
 
 ```bash
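For context, after this change the only install path the post documents is the PyPI package. A minimal end-to-end sketch of that flow, assuming the `llms` entry point that the second post invokes with `--serve` and that the package pulls in its own dependencies (such as aiohttp):

```bash
# Install llms.py (and its UI) from PyPI; dependencies are resolved by pip
pip install llms-py

# Start the self-hosted UI and OpenAI-compatible endpoint on port 8000
llms --serve 8000
```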

MyApp/_posts/2025-10-01_llms-py-ui.md

Lines changed: 0 additions & 11 deletions
@@ -19,8 +19,6 @@ in Browsers to avoid needing any npm dependencies or build tools.
 
 ## Install
 
-To get both llms.py and its UI it's recommended to install from PyPI:
-
 :::sh
 pip install llms-py
 :::
@@ -33,15 +31,6 @@ llms --serve 8000
 
 To launch the UI at `http://localhost:8000` and an OpenAI Endpoint at `http://localhost:8000/v1/chat/completions`.
 
-If no UI is needed, download just [llms.py](https://github.com/ServiceStack/llms/blob/main/llms.py)
-for all other client & server features:
-
-```bash
-curl -O https://raw.githubusercontent.com/ServiceStack/llms/main/llms.py
-chmod +x llms.py
-mv llms.py ~/.local/bin/llms
-```
-
 ## Simple and Flexible UI
 
 This starts the Chat UI from where you can interact with any of your configured OpenAI-compatible Chat
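Once the server is running, the endpoint above can be exercised directly. A hedged sketch assuming the standard OpenAI chat completions request shape, with a placeholder model name standing in for whatever is configured in llms.json:

```bash
# Query the local OpenAI-compatible endpoint started by `llms --serve 8000`.
# "model" is a placeholder: substitute any model configured in llms.json.
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "your-configured-model",
        "messages": [{"role": "user", "content": "Hello!"}]
      }'
```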

0 commit comments
