Commit c76575d

Rename project to LiteLLM Companion
1 parent 987fbdb commit c76575d

20 files changed, 40 insertions(+), 40 deletions(-)

.github/workflows/docker.yml (9 additions, 9 deletions)

@@ -41,9 +41,9 @@ jobs:
           file: backend/Dockerfile
           push: true
           tags: |
-            ghcr.io/makespacemadrid/litellm-updater-backend:latest
-            ghcr.io/makespacemadrid/litellm-updater-backend:${{ steps.version.outputs.version }}
-            ghcr.io/makespacemadrid/litellm-updater-backend:${{ github.sha }}
+            ghcr.io/makespacemadrid/litellm-companion-backend:latest
+            ghcr.io/makespacemadrid/litellm-companion-backend:${{ steps.version.outputs.version }}
+            ghcr.io/makespacemadrid/litellm-companion-backend:${{ github.sha }}
 
       - name: Build and push web image
         uses: docker/build-push-action@v6
@@ -52,9 +52,9 @@ jobs:
           file: frontend/Dockerfile
           push: true
           tags: |
-            ghcr.io/makespacemadrid/litellm-updater-web:latest
-            ghcr.io/makespacemadrid/litellm-updater-web:${{ steps.version.outputs.version }}
-            ghcr.io/makespacemadrid/litellm-updater-web:${{ github.sha }}
+            ghcr.io/makespacemadrid/litellm-companion-web:latest
+            ghcr.io/makespacemadrid/litellm-companion-web:${{ steps.version.outputs.version }}
+            ghcr.io/makespacemadrid/litellm-companion-web:${{ github.sha }}
 
       - name: Build and push proxy image
         uses: docker/build-push-action@v6
@@ -63,6 +63,6 @@ jobs:
           file: Dockerfile
           push: true
           tags: |
-            ghcr.io/makespacemadrid/litellm-updater-proxy:latest
-            ghcr.io/makespacemadrid/litellm-updater-proxy:${{ steps.version.outputs.version }}
-            ghcr.io/makespacemadrid/litellm-updater-proxy:${{ github.sha }}
+            ghcr.io/makespacemadrid/litellm-companion-proxy:latest
+            ghcr.io/makespacemadrid/litellm-companion-proxy:${{ steps.version.outputs.version }}
+            ghcr.io/makespacemadrid/litellm-companion-proxy:${{ github.sha }}
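Each image in the workflow is pushed under the same tag triple: a floating `latest`, the release version, and the commit SHA. A minimal sketch of that scheme, using hypothetical stand-ins for the workflow expressions (`VERSION` for `${{ steps.version.outputs.version }}`, `SHA` for `${{ github.sha }}`):

```shell
# Hypothetical stand-ins for the values GitHub Actions injects at build time.
IMAGE=ghcr.io/makespacemadrid/litellm-companion-backend
VERSION=0.5.22
SHA=c76575d

# Emit the three tags the workflow would push for this image.
for TAG in latest "$VERSION" "$SHA"; do
  printf '%s:%s\n' "$IMAGE" "$TAG"
done
```

Pinning by version or SHA lets deployments opt out of the mutable `latest` tag while still tracking the same build.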

CLAUDE.md (1 addition, 1 deletion)

@@ -4,7 +4,7 @@ This file provides guidance to Claude Code (claude.ai/code) when working with co
 
 ## Project Overview
 
-LiteLLM Updater runs as two FastAPI services built from shared code:
+LiteLLM Companion runs as two FastAPI services built from shared code:
 - `backend/` → headless sync worker (`backend/sync_worker.py`) that fetches provider models and can push them into LiteLLM on a schedule.
 - `frontend/` → UI + API (`frontend/api.py`) for manual fetch/push/sync and CRUD over providers/models.

DEFAULT_OPENAI_COMPAT_MODELS.md (1 addition, 1 deletion)

@@ -435,4 +435,4 @@ See `scripts/register_default_compat_models.sh` for automated registration.
 
 **Document Version:** 1.0
 **Last Updated:** 2025-11-29
-**Maintainer:** LiteLLM Updater Project
+**Maintainer:** LiteLLM Companion Project

LITELLM_MODEL_PARAMETERS.md (1 addition, 1 deletion)

@@ -1198,4 +1198,4 @@ Is the model local/free?
 
 **Document Version:** 2.0 (Agent-Optimized)
 **Last Updated:** 2025-11-29
-**Maintainer:** LiteLLM Updater Project
+**Maintainer:** LiteLLM Companion Project

OLLAMA_TO_LITELLM_MAPPING.md (1 addition, 1 deletion)

@@ -680,4 +680,4 @@ This mapping is based on analysis of the following Ollama models:
 
 **Document Version:** 1.0
 **Last Updated:** 2025-11-29
-**Maintainer:** LiteLLM Updater Project
+**Maintainer:** LiteLLM Companion Project

OPENAI_API_MODELS.md (1 addition, 1 deletion)

@@ -328,4 +328,4 @@ Expected output:
 
 **Document Version:** 1.0
 **Last Updated:** 2025-11-29
-**Maintainer:** LiteLLM Updater Project
+**Maintainer:** LiteLLM Companion Project

OPENAI_TO_OLLAMA_MAPPING.md (1 addition, 1 deletion)

@@ -396,4 +396,4 @@ for model in models_to_test:
 
 **Document Version:** 1.0
 **Last Updated:** 2025-11-29
-**Maintainer:** LiteLLM Updater Project
+**Maintainer:** LiteLLM Companion Project

README.md (4 additions, 4 deletions)

@@ -1,4 +1,4 @@
-# litellm-updater
+# litellm-companion
 
 A FastAPI service that synchronizes models from Ollama or other LiteLLM/OpenAI-compatible servers into a LiteLLM proxy. It periodically scans upstream sources for models, persists them to a database, and registers them with LiteLLM using the admin API. Includes a web UI for provider management, model editing, and monitoring.
 
@@ -22,7 +22,7 @@ A FastAPI service that synchronizes models from Ollama or other LiteLLM/OpenAI-c
 
 2. **Run the server**
    ```bash
-   PORT=8000 litellm-updater
+   PORT=8000 litellm-companion
    # or
    PORT=8000 uvicorn litellm_updater.web:create_app --port $PORT
    ```
@@ -42,8 +42,8 @@ A FastAPI service that synchronizes models from Ollama or other LiteLLM/OpenAI-c
 ## Running with Docker
 - Build the image directly:
   ```bash
-  docker build -t litellm-updater .
-  docker run --rm -e PORT=8000 -p 8000:8000 -v $(pwd)/data:/app/data litellm-updater
+  docker build -t litellm-companion .
+  docker run --rm -e PORT=8000 -p 8000:8000 -v $(pwd)/data:/app/data litellm-companion
   ```
 
 - Or use Docker Compose with the provided `example.env` (copy or override values as needed):

backend/__init__.py (1 addition, 1 deletion)

@@ -1,3 +1,3 @@
 """Backend sync worker service."""
 
-__version__ = "0.5.21"
+__version__ = "0.5.22"

docker-compose.yml (5 additions, 5 deletions)

@@ -1,7 +1,7 @@
 services:
   # Backend sync worker (no HTTP server)
   model-updater-backend:
-    image: ghcr.io/makespacemadrid/litellm-updater-backend:latest
+    image: ghcr.io/makespacemadrid/litellm-companion-backend:latest
     build:
       context: .
       dockerfile: Dockerfile
@@ -19,7 +19,7 @@ services:
 
   # Frontend API + UI
   model-updater-web:
-    image: ghcr.io/makespacemadrid/litellm-updater-web:latest
+    image: ghcr.io/makespacemadrid/litellm-companion-web:latest
     build:
       context: .
       dockerfile: Dockerfile
@@ -41,7 +41,7 @@ services:
 
   # OpenAI-compatible proxy (chat/completions)
   model-updater-proxy:
-    image: ghcr.io/makespacemadrid/litellm-updater-proxy:latest
+    image: ghcr.io/makespacemadrid/litellm-companion-proxy:latest
     build:
       context: .
       dockerfile: Dockerfile
@@ -97,7 +97,7 @@ services:
       - litellm
   # env-sync:
   #   build: .
-  #   image: litellm-updater:latest
+  #   image: litellm-companion:latest
  #   command: >
 #     python /app/scripts/sync_env.py
 #   volumes:
@@ -114,7 +114,7 @@ services:
     volumes:
       - /var/run/docker.sock:/var/run/docker.sock
   # depends_on:
-  #   - litellm-updater
+  #   - litellm-companion
 
 networks:
   litellm:
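The whole commit is a uniform find-and-replace over the repository. A minimal sketch of how such a rename could be reproduced with `grep` and `sed` — run here against a throwaway directory so it is safe anywhere, and assuming GNU `sed` (in-place `-i` with no suffix argument):

```shell
# Build a throwaway tree with two illustrative files containing the old name.
tmp=$(mktemp -d)
printf 'image: ghcr.io/makespacemadrid/litellm-updater-backend:latest\n' > "$tmp/compose.yml"
printf 'LiteLLM Updater runs as two FastAPI services built from shared code:\n' > "$tmp/CLAUDE.md"

# Replace both spellings of the old name, only in files that contain them.
grep -rl 'litellm-updater' "$tmp" | xargs -r sed -i 's/litellm-updater/litellm-companion/g'
grep -rl 'LiteLLM Updater' "$tmp" | xargs -r sed -i 's/LiteLLM Updater/LiteLLM Companion/g'

cat "$tmp/compose.yml" "$tmp/CLAUDE.md"
```

In a real checkout one would exclude `.git` from the `grep` and review the diff before committing; note the commit deliberately leaves code identifiers like `litellm_updater.web:create_app` untouched.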
