
Does paper-qa support vLLM 0.7.3? #1041

Answered by linkai1208
linkai1208 asked this question in Q&A

Adding a custom_llm_provider is enough; thank you for the reminder!

from paperqa import Settings

settings = Settings(
    llm="deepseek-llama3-70b",
    llm_config={
        "model_list": [
            {
                "model_name": "deepseek-llama3-70b",
                "litellm_params": {
                    # Route through LiteLLM's OpenAI-compatible provider,
                    # which matches the API a vLLM server exposes.
                    "model": "deepseek-llama3-70b",
                    "api_base": "http://your-vllm-server:8000/v1",
                    "api_key": "your-api-key-if-needed",
                    "custom_llm_provider": "openai",
                    "temperature": 0.1,
                    "max_tokens": 512,
                }
            }
        ]
    },
    # Set summary_llm and embedding as needed
)
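
For completeness, here is a minimal usage sketch built on the settings above. It assumes a vLLM OpenAI-compatible server is already running at the configured api_base, and the question text is just a hypothetical placeholder; the top-level ask helper is paper-qa's documented entry point:

from paperqa import ask

# Run the paper-qa agent with the vLLM-backed settings defined above.
answer_response = ask(
    "What manufacturing challenges are unique to bispecific antibodies?",  # hypothetical question
    settings=settings,
)
# The response object carries the generated answer and its citations;
# exact attribute names vary across paper-qa versions.
print(answer_response)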

Answer selected by jamesbraza