
Commit b3ee7ec

fix links
1 parent be63c19 commit b3ee7ec

2 files changed (+5, −5 lines)

MyApp/_posts/2025-09-25_llms-py.md (3 additions, 3 deletions)

```diff
@@ -29,10 +29,10 @@ preferred priority - whether optimizing for cost, performance or availability.
 
 ### ⚡ Ultra-Lightweight Architecture
 
-- **Single File**: Just one [llms.py](https://github.com/ServiceStack/llms/blob/main/llms.py) file (easily customizable)
+- **Single File**: Just one [llms.py](https://github.com/ServiceStack/llms/blob/main/llms/main.py) file (easily customizable)
 - **Single Dependency**: Single `aiohttp` dependency
 - **Zero Dependencies for ComfyUI**: Ideal for use in ComfyUI Custom Nodes
-- **No Setup**: Just download and use, configure preferred LLMs in [llms.json](https://github.com/ServiceStack/llms/blob/main/llms.json)
+- **No Setup**: Just download and use, configure preferred LLMs in [llms.json](https://github.com/ServiceStack/llms/blob/main/llms/llms.json)
 
 
 ## 📦 Install
@@ -151,7 +151,7 @@ Use JSON config to add any OpenAI-compatible API endpoints and models
 - **Environment Variables**: Secure API key management
 - **Provider Management**: Easy enable/disable of providers
 - **Custom Models**: Define your own model aliases and mappings
-- **Unified Configuration**: Single [llms.json](https://github.com/ServiceStack/llms/blob/main/llms.json) to configure all providers and models
+- **Unified Configuration**: Single [llms.json](https://github.com/ServiceStack/llms/blob/main/llms/llms.json) to configure all providers and models
 
 ## 🎯 Use Cases
 
```

MyApp/_posts/2025-10-01_llms-py-ui.md (2 additions, 2 deletions)

```diff
@@ -39,9 +39,9 @@ Providers - for a single unified interface for accessing both local and premium
 ## Configuration
 
 You can configure which OpenAI compatible providers and models you want to use by adding them to your
-[llms.json](https://github.com/ServiceStack/llms/blob/main/llms.json) in `~/.llms/llms.json`
+[llms.json](https://github.com/ServiceStack/llms/blob/main/llms/llms.json) in `~/.llms/llms.json`
 
-Whilst the [ui.json](https://github.com/ServiceStack/llms/blob/main/ui.json) configuration for the UI is maintained
+Whilst the [ui.json](https://github.com/ServiceStack/llms/blob/main/llms/ui.json) configuration for the UI is maintained
 in `~/.llms/ui.json` where you can configure your preferred system prompts and other defaults.
 
 ### Fast, Local and Private
```
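The posts being patched say that providers and models are configured in `~/.llms/llms.json` and that API keys come from environment variables. As a purely illustrative sketch of what such a file could look like — the field names, provider ids, and model aliases below are assumptions, not the project's actual schema; consult the repo's llms.json (linked in the diffs above) for the real structure:

```json
{
  "providers": {
    "openrouter": {
      "enabled": true,
      "api_key": "$OPENROUTER_API_KEY",
      "models": { "my-alias": "some/upstream-model-id" }
    },
    "ollama": {
      "enabled": false,
      "base_url": "http://localhost:11434"
    }
  }
}
```

The `$OPENROUTER_API_KEY` placeholder illustrates the "Environment Variables" bullet from the first post: a key is referenced by name rather than stored inline, and `"enabled"` illustrates the "Provider Management" enable/disable toggle.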

0 commit comments
