Commit e912b89

Merge branch 'main' into feat/remove-servername-prefix-mcp_tools

Parents: 7d93856 + e377e30
File tree: 183 files changed, +10382 −1604 lines changed

.circleci/config.yml

Lines changed: 55 additions & 2 deletions

@@ -1050,6 +1050,51 @@ jobs:
             ls
             python -m pytest -vv tests/test_litellm --cov=litellm --cov-report=xml -x -s -v --junitxml=test-results/junit-litellm.xml --durations=10 -n 8
           no_output_timeout: 120m
+      - run:
+          name: Rename the coverage files
+          command: |
+            mv coverage.xml litellm_mapped_tests_coverage.xml
+            mv .coverage litellm_mapped_tests_coverage
+
+      # Store test results
+      - store_test_results:
+          path: test-results
+      - persist_to_workspace:
+          root: .
+          paths:
+            - litellm_mapped_tests_coverage.xml
+            - litellm_mapped_tests_coverage
+  litellm_mapped_enterprise_tests:
+    docker:
+      - image: cimg/python:3.11
+        auth:
+          username: ${DOCKERHUB_USERNAME}
+          password: ${DOCKERHUB_PASSWORD}
+    working_directory: ~/project
+
+    steps:
+      - checkout
+      - setup_google_dns
+      - run:
+          name: Install Dependencies
+          command: |
+            python -m pip install --upgrade pip
+            python -m pip install -r requirements.txt
+            pip install "pytest-mock==3.12.0"
+            pip install "pytest==7.3.1"
+            pip install "pytest-retry==1.6.3"
+            pip install "pytest-cov==5.0.0"
+            pip install "pytest-asyncio==0.21.1"
+            pip install "respx==0.22.0"
+            pip install "hypercorn==0.17.3"
+            pip install "pydantic==2.10.2"
+            pip install "mcp==1.10.1"
+            pip install "requests-mock>=1.12.1"
+            pip install "responses==0.25.7"
+            pip install "pytest-xdist==3.6.1"
+            pip install "semantic_router==0.1.10"
+            pip install "fastapi-offline==1.7.3"
+      - setup_litellm_enterprise_pip
       - run:
           name: Run enterprise tests
           command: |
@@ -1779,8 +1824,8 @@ jobs:
             docker run -d \
               -p 4000:4000 \
               -e DATABASE_URL=postgresql://postgres:[email protected]:5432/circle_test \
-              -e AZURE_API_KEY=$AZURE_BATCHES_API_KEY \
-              -e AZURE_API_BASE=$AZURE_BATCHES_API_BASE \
+              -e AZURE_API_KEY=$AZURE_API_KEY \
+              -e AZURE_API_BASE=$AZURE_API_BASE \
               -e AZURE_API_VERSION="2024-05-01-preview" \
               -e REDIS_HOST=$REDIS_HOST \
               -e REDIS_PASSWORD=$REDIS_PASSWORD \
@@ -3175,6 +3220,12 @@ workflows:
              only:
                - main
                - /litellm_.*/
+      - litellm_mapped_enterprise_tests:
+          filters:
+            branches:
+              only:
+                - main
+                - /litellm_.*/
       - litellm_mapped_tests:
           filters:
             branches:
@@ -3219,6 +3270,7 @@ workflows:
             - guardrails_testing
             - llm_responses_api_testing
             - litellm_mapped_tests
+            - litellm_mapped_enterprise_tests
             - batches_testing
             - litellm_utils_testing
             - pass_through_unit_testing
@@ -3279,6 +3331,7 @@ workflows:
             - google_generate_content_endpoint_testing
             - llm_responses_api_testing
             - litellm_mapped_tests
+            - litellm_mapped_enterprise_tests
             - batches_testing
             - litellm_utils_testing
             - pass_through_unit_testing
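The new "Rename the coverage files" step gives each job's coverage output a unique name so several jobs can persist their files into one shared workspace without clobbering each other. A minimal sketch of that rename in isolation (the `coverage.xml` and `.coverage` files here are stand-ins for real pytest-cov output, not produced by an actual test run):

```shell
# Stand-ins for the files pytest-cov writes (coverage.xml and .coverage)
mkdir -p /tmp/cov-demo
cd /tmp/cov-demo
echo '<coverage/>' > coverage.xml
echo 'data' > .coverage

# The rename performed by the new CI step: generic names -> job-specific names
mv coverage.xml litellm_mapped_tests_coverage.xml
mv .coverage litellm_mapped_tests_coverage

ls /tmp/cov-demo
```

With job-specific names, `persist_to_workspace` can attach both files downstream alongside coverage files from other jobs.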

.github/workflows/test-mcp.yml

Lines changed: 48 additions & 0 deletions

@@ -0,0 +1,48 @@
+name: LiteLLM MCP Tests (folder - tests/mcp_tests)
+
+on:
+  pull_request:
+    branches: [ main ]
+
+jobs:
+  test:
+    runs-on: ubuntu-latest
+    timeout-minutes: 25
+
+    steps:
+      - uses: actions/checkout@v4
+
+      - name: Thank You Message
+        run: |
+          echo "### 🙏 Thank you for contributing to LiteLLM!" >> $GITHUB_STEP_SUMMARY
+          echo "Your PR is being tested now. We appreciate your help in making LiteLLM better!" >> $GITHUB_STEP_SUMMARY
+
+      - name: Set up Python
+        uses: actions/setup-python@v4
+        with:
+          python-version: '3.12'
+
+      - name: Install Poetry
+        uses: snok/install-poetry@v1
+
+      - name: Install dependencies
+        run: |
+          poetry install --with dev,proxy-dev --extras "proxy semantic-router"
+          poetry run pip install "pytest==7.3.1"
+          poetry run pip install "pytest-retry==1.6.3"
+          poetry run pip install "pytest-cov==5.0.0"
+          poetry run pip install "pytest-asyncio==0.21.1"
+          poetry run pip install "respx==0.22.0"
+          poetry run pip install "pydantic==2.10.2"
+          poetry run pip install "mcp==1.10.1"
+          poetry run pip install pytest-xdist
+
+      - name: Setup litellm-enterprise as local package
+        run: |
+          cd enterprise
+          python -m pip install -e .
+          cd ..
+
+      - name: Run MCP tests
+        run: |
+          poetry run pytest tests/mcp_tests -x -vv -n 4 --cov=litellm --cov-report=xml --durations=5

Dockerfile

Lines changed: 0 additions & 3 deletions

@@ -41,9 +41,6 @@ RUN pip uninstall jwt -y
 RUN pip uninstall PyJWT -y
 RUN pip install PyJWT==2.9.0 --no-cache-dir

-# Build Admin UI
-RUN chmod +x docker/build_admin_ui.sh && ./docker/build_admin_ui.sh
-
 # Runtime stage
 FROM $LITELLM_RUNTIME_IMAGE AS runtime

README.md

Lines changed: 1 addition & 1 deletion

@@ -37,7 +37,7 @@ LiteLLM manages:
 - Retry/fallback logic across multiple deployments (e.g. Azure/OpenAI) - [Router](https://docs.litellm.ai/docs/routing)
 - Set Budgets & Rate limits per project, api key, model [LiteLLM Proxy Server (LLM Gateway)](https://docs.litellm.ai/docs/simple_proxy)

-[**Jump to LiteLLM Proxy (LLM Gateway) Docs**](https://github.com/BerriAI/litellm?tab=readme-ov-file#openai-proxy---docs) <br>
+[**Jump to LiteLLM Proxy (LLM Gateway) Docs**](https://github.com/BerriAI/litellm?tab=readme-ov-file#litellm-proxy-server-llm-gateway---docs) <br>
 [**Jump to Supported LLM Providers**](https://github.com/BerriAI/litellm?tab=readme-ov-file#supported-providers-docs)

 🚨 **Stable Release:** Use docker images with the `-stable` tag. These have undergone 12 hour load tests, before being published. [More information about the release cycle here](https://docs.litellm.ai/docs/proxy/release_cycle)
Lines changed: 62 additions & 0 deletions

@@ -0,0 +1,62 @@
+#!/usr/bin/env python3
+"""
+Example: Using CLI token with LiteLLM SDK
+
+This example shows how to use the CLI authentication token
+in your Python scripts after running `litellm-proxy login`.
+"""
+
+from textwrap import indent
+import litellm
+
+LITELLM_BASE_URL = "http://localhost:4000/"
+
+
+def main():
+    """Using CLI token with LiteLLM SDK"""
+    print("🚀 Using CLI Token with LiteLLM SDK")
+    print("=" * 40)
+    # litellm._turn_on_debug()
+
+    # Get the CLI token
+    api_key = litellm.get_litellm_gateway_api_key()
+
+    if not api_key:
+        print("❌ No CLI token found. Please run 'litellm-proxy login' first.")
+        return
+
+    print("✅ Found CLI token.")
+
+    available_models = litellm.get_valid_models(
+        check_provider_endpoint=True,
+        custom_llm_provider="litellm_proxy",
+        api_key=api_key,
+        api_base=LITELLM_BASE_URL,
+    )
+
+    print("✅ Available models:")
+    if available_models:
+        for i, model in enumerate(available_models, 1):
+            print(f"  {i:2d}. {model}")
+    else:
+        print("  No models available")
+
+    # Use with LiteLLM
+    try:
+        response = litellm.completion(
+            model="litellm_proxy/gemini/gemini-2.5-flash",
+            messages=[{"role": "user", "content": "Hello from CLI token!"}],
+            api_key=api_key,
+            base_url=LITELLM_BASE_URL,
+        )
+        print(f"✅ LLM Response: {response.model_dump_json(indent=4)}")
+    except Exception as e:
+        print(f"❌ Error: {e}")
+
+
+if __name__ == "__main__":
+    main()
+
+    print("\n💡 Tips:")
+    print("1. Run 'litellm-proxy login' to authenticate first")
+    print("2. Replace 'https://your-proxy.com' with your actual proxy URL")
+    print("3. The token is stored locally at ~/.litellm/token.json")
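The example's tips say the token is stored locally at ~/.litellm/token.json, but this commit does not show that file's schema. A minimal sketch of reading such a token file directly, without the SDK helper — assuming, purely as a guess, a JSON payload with a top-level "key" field (the real field name may differ):

```python
import json
from pathlib import Path
from typing import Optional


def load_cli_token(path: str = "~/.litellm/token.json") -> Optional[str]:
    """Read a locally stored CLI token, if present.

    ASSUMPTION: the file is JSON with a top-level "key" field holding the
    API key. That schema is not shown in this commit; in real code, prefer
    litellm.get_litellm_gateway_api_key(), which knows the actual format.
    """
    token_file = Path(path).expanduser()
    if not token_file.exists():
        return None  # not logged in: caller should prompt for `litellm-proxy login`
    data = json.loads(token_file.read_text())
    return data.get("key")
```

This mirrors the example's flow: a missing file maps to the "No CLI token found" branch, and a present file yields the key passed as `api_key`.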
