Add LiteLLM proxy service and config#374

Open
NoorChasib wants to merge 2 commits into main from llm-server-update

Conversation

@NoorChasib
Collaborator

Introduce an OpenAI-compatible LiteLLM proxy: add an llm_proxy service to .docker/compose.controller.yaml (port 8080, depends on llm_server, mounts the config file, uses LITELLM_MASTER_KEY for client auth and LLM_API_KEY for backend auth). Add LLM_API_KEY and LITELLM_MASTER_KEY to .docker/.env-template (with a generation hint). Add .docker/llm/litellm_config.yaml listing model_name -> backend mappings and a general_settings master_key entry to route requests to the various llm_server endpoints.
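
For reviewers, the service described above might look roughly like the sketch below. This is not the actual diff: the image tag, container paths, and the startup command are assumptions based on common LiteLLM deployments.

```yaml
# Sketch of the llm_proxy service (not the PR's actual compose entry).
# Image tag, mount paths, and command flags are assumed, not confirmed.
llm_proxy:
  image: ghcr.io/berriai/litellm:main-latest   # assumed image tag
  ports:
    - "8080:8080"                              # proxy listens on 8080
  depends_on:
    - llm_server                               # backend must be up first
  volumes:
    - ./llm/litellm_config.yaml:/app/config.yaml
  environment:
    LITELLM_MASTER_KEY: ${LITELLM_MASTER_KEY}  # client-facing auth key
    LLM_API_KEY: ${LLM_API_KEY}                # auth against llm_server
  command: ["--config", "/app/config.yaml", "--port", "8080"]
```

Both keys would come from .docker/.env-template, e.g. generated with something like `openssl rand -hex 32` per the PR's generation hint.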

🎯 Summary

ZUBA-

🔰 Checklist

  • I have read and agree with the following checklist
  • I have performed a self-review of my code.
  • I have commented my code, particularly in hard-to-understand areas.
  • I have made corresponding changes to the documentation where required.
  • I have tested my changes to the best of my ability.
  • I have consulted with the team if introducing a new dependency.
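
The .docker/llm/litellm_config.yaml described in the summary maps public model names to llm_server backends. A minimal sketch of that shape, assuming LiteLLM's standard config schema; the concrete model names and endpoint URLs here are placeholders, not taken from the PR:

```yaml
# Hypothetical model_name -> backend mapping; real names/endpoints
# are in the PR diff, not reproduced here.
model_list:
  - model_name: chat-default                   # name clients request
    litellm_params:
      model: openai/example-model              # assumed backend model id
      api_base: http://llm_server:8000/v1      # assumed llm_server endpoint
      api_key: os.environ/LLM_API_KEY          # backend auth from env

general_settings:
  master_key: os.environ/LITELLM_MASTER_KEY    # required by proxy clients
```

The `os.environ/` prefix is LiteLLM's convention for reading secrets from environment variables, which keeps both keys out of the committed config file.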

