Conversation

@par4m
Contributor

@par4m par4m commented Jun 17, 2025

This PR adds support for using LiteLLM as a proxy to run multiple LLMs, as described in issue #627.

The proxy URL defaults to http://localhost:4000.

To test:

Install the LiteLLM proxy: `pip install 'litellm[proxy]'`

1. OpenAI: create `config.yml`

   ```yaml
   model_list:
     - model_name: "*"             # all requests where model not in your config go to this deployment
       litellm_params:
         model: openai/*           # set `openai/` to use the openai route
         api_key: os.environ/OPENAI_API_KEY
   ```

2. DeepSeek: pull the model with `ollama pull deepseek-r1`, then create `config.yml`

   ```yaml
   model_list:
     - model_name: "deepseek-r1"
       litellm_params:
         model: "ollama_chat/deepseek-r1"
         api_base: "http://localhost:11434"
   ```

Run LiteLLM:

```shell
litellm --config config.yml
```
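Before wiring the proxy into cocoindex, it can help to smoke-test it directly. The sketch below is not part of the PR; it only shows how one might build a request for the proxy's OpenAI-compatible `/v1/chat/completions` endpoint, assuming the default address and a model name matching `config.yml`:

```python
import json

# Hypothetical smoke-test payload for the LiteLLM proxy's
# OpenAI-compatible /v1/chat/completions endpoint.
payload = {
    "model": "deepseek-r1",  # must match a model_name in config.yml
    "messages": [{"role": "user", "content": "Say hello"}],
}
body = json.dumps(payload)

# Send it against a running proxy (default http://localhost:4000), e.g.:
#   curl -X POST http://localhost:4000/v1/chat/completions \
#     -H "Content-Type: application/json" \
#     -d "$BODY"
print(body)
```

If the proxy is configured correctly, the response is a standard OpenAI-style chat completion object.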

In `main.py`, set the LLM spec accordingly:

```python
cocoindex.functions.ExtractByLlm(
    llm_spec=cocoindex.LlmSpec(
        api_type=cocoindex.LlmApiType.LITELLM,
        model="deepseek-r1",  # or whatever model is set on your LiteLLM proxy
        # address can be omitted to use http://0.0.0.0:4000 by default
    ),
)
```

@badmonster0
Member

Can you also help to update the documentation to describe how to use it there? Thanks!

@par4m
Contributor Author

par4m commented Jun 18, 2025

> Can you also help to update the documentation to describe how to use it there? Thanks!

Sure! Should I include both DeepSeek (Ollama) and OpenAI examples?

@badmonster0
Member

> Can you also help to update the documentation to describe how to use it there? Thanks!
>
> Sure! Should I include both DeepSeek (Ollama) and OpenAI examples?

Sounds good. Thanks!

@badmonster0
Member

Thanks for the PR!

@badmonster0 badmonster0 merged commit 95c8479 into cocoindex-io:main Jun 18, 2025
7 checks passed
@badmonster0
Member

Hi @par4m, thank you so much for your contribution.
We made a section for you in our latest release notes. We love your contribution, and thank you ❤️!

https://cocoindex.io/blogs/cocoindex-changelog-2025-07-07/#par4m
