chore(deps): update dependency litellm to v1.61.15 [security]#201
Open
renovate[bot] wants to merge 1 commit into develop from
Conversation
This PR contains the following updates:
litellm: ==1.44.8 → ==1.61.15

GitHub Vulnerability Alerts
CVE-2024-10188
A vulnerability in BerriAI/litellm, as of commit 26c03c9, allows unauthenticated users to cause a Denial of Service (DoS) by exploiting the use of ast.literal_eval to parse user input. This function is not safe and is prone to DoS attacks, which can crash the litellm Python server.
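The root cause is that ast.literal_eval must parse an arbitrarily complex Python expression before it can reject it, so deeply nested attacker input can exhaust the parser. A minimal mitigation sketch, assuming the input is expected to be JSON-shaped; the length cap and the swap to json.loads are illustrative assumptions, not litellm's actual fix:

```python
import json

MAX_INPUT_LEN = 4096  # illustrative cap, not a litellm setting


def parse_untrusted(payload: str):
    """Parse untrusted input as JSON with a size cap, instead of ast.literal_eval."""
    if len(payload) > MAX_INPUT_LEN:
        # Reject oversized payloads before any parsing work happens.
        raise ValueError("payload too large")
    # json.loads accepts only JSON literals and raises a clean error on
    # malformed or excessively nested input, rather than crashing the process.
    return json.loads(payload)
```

Bounding the input size before parsing is the key point: the DoS comes from work done while parsing, so the check must precede the parse.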
CVE-2025-0628
An improper authorization vulnerability exists in the main-latest version of BerriAI/litellm. When a user with the role 'internal_user_viewer' logs into the application, they are provided with an overly privileged API key. This key can be used to access all the admin functionality of the application, including endpoints such as '/users/list' and '/users/get_users'. This vulnerability allows for privilege escalation within the application, enabling any account to become a PROXY ADMIN.
CVE-2024-9606
In berriai/litellm before version 1.44.12, the litellm/litellm_core_utils/litellm_logging.py file contains a vulnerability where the API key masking code masks only the first 5 characters of the key. This results in the leakage of almost the entire API key in the logs, exposing a significant amount of the secret key. The issue affects version v1.44.9.
CVE-2024-8984
A Denial of Service (DoS) vulnerability exists in berriai/litellm version v1.44.5. This vulnerability can be exploited by appending characters, such as dashes (-), to the end of a multipart boundary in an HTTP request. The server continuously processes each character, leading to excessive resource consumption and rendering the service unavailable. The issue is unauthenticated and does not require any user interaction, impacting all users of the service.
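The masking flaw in CVE-2024-9606 above inverts the usual convention: a redaction helper should hide the middle of the secret and keep at most a short prefix and suffix visible. A hedged sketch of such a helper (mask_key is a hypothetical name, not litellm's implementation):

```python
def mask_key(key: str, visible: int = 4) -> str:
    """Redact an API key for logging, keeping a short prefix and suffix.

    Hypothetical helper; the fixed litellm code may differ.
    """
    if len(key) <= visible * 2:
        # Too short to safely reveal any characters at all.
        return "*" * len(key)
    # Show the first and last `visible` characters; star out everything between.
    return key[:visible] + "*" * (len(key) - visible * 2) + key[-visible:]
```

Note the length guard: without it, short keys would be echoed almost verbatim, which is exactly the class of leak the CVE describes.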
Release Notes
BerriAI/litellm (litellm)
v1.61.7
What's Changed
- return_citations documentation by @miraclebakelaser in #8527
- /bedrock/meta.llama3-3-70b-instruct-v1:0 tool calling support + cost tracking + base llm unit test for tool calling by @ishaan-jaff in #8545
- /completions route by @ishaan-jaff in #8551
- x-litellm-attempted-fallbacks in responses from litellm proxy by @ishaan-jaff in #8558
New Contributors
Full Changelog: BerriAI/litellm@v1.61.3...v1.61.7
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.61.3
What's Changed
- /models and /model_group/info by @krrishdholakia in #8473
- include_usage for /completions requests + unit testing by @ishaan-jaff in #8484
- PerplexityChatConfig - track correct OpenAI compatible params by @ishaan-jaff in #8496
- -nightly by @krrishdholakia in #8499
- gemini-2.0-pro-exp-02-05 vertex ai model to cost map + new bedrock/deepseek_r1/* route by @krrishdholakia in #8525
Full Changelog: BerriAI/litellm@v1.61.1...v1.61.3
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.61.1
Compare Source
What's Changed
Full Changelog: BerriAI/litellm@v1.61.0...v1.61.1
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.61.0
What's Changed
- /bedrock/invoke/ by @ishaan-jaff in #8397
- /team/ updates in multi-instance deployments with Redis by @ishaan-jaff in #8440
New Contributors
Full Changelog: BerriAI/litellm@v1.60.8...v1.61.0
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.60.8
What's Changed
- /cache/ping + add timeout value and elapsed time on azure + http calls by @krrishdholakia in #8377
- /bedrock/invoke support for all Anthropic models by @ishaan-jaff in #8383
Full Changelog: BerriAI/litellm@v1.60.6...v1.60.8
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.60.6
Compare Source
What's Changed
- choices=[] by @ishaan-jaff in #8339
- choices=[] on llm responses by @ishaan-jaff in #8342
New Contributors
Full Changelog: BerriAI/litellm@v1.60.5...v1.60.6
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.60.5
Compare Source
What's Changed
- BaseLLMHTTPHandler class by @ishaan-jaff in #8290
New Contributors
Full Changelog: BerriAI/litellm@v1.60.4...v1.60.5
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.60.4
Compare Source
What's Changed
- bedrock/nova models + add util litellm.supports_tool_choice by @ishaan-jaff in #8264
- role based access to proxy by @krrishdholakia in #8260
Full Changelog: BerriAI/litellm@v1.60.2...v1.60.4
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.60.2
Compare Source
What's Changed
- sso_user_id to LiteLLM_UserTable by @krrishdholakia in #8167
- /vertex_ai/ was not detected as llm_api_route on pass through but vertex-ai was by @ishaan-jaff in #8186
- mode as list, fix valid keys error in pydantic, add more testing by @krrishdholakia in #8224
New Contributors
Full Changelog: BerriAI/litellm@v1.60.0...v1.60.2
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.60.0
What's Changed
Important Changes between v1.50.xx to 1.60.0
def async_log_stream_event and def log_stream_event are no longer supported for CustomLoggers (https://docs.litellm.ai/docs/observability/custom_callback). If you want to log stream events, use def async_log_success_event and def log_success_event for logging success stream events.
Known Issues
🚨 Detected issue with Langfuse Logging when Langfuse credentials are stored in DB
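The stream-event deprecation above amounts to moving per-stream logic into the success hooks, which fire once the (streamed) response completes. A minimal sketch of the migrated shape; a stand-in base class is used here so the example runs without litellm installed, but the hook name matches litellm's documented CustomLogger interface:

```python
# Stand-in for litellm.integrations.custom_logger.CustomLogger so this
# sketch is self-contained; the real base class exposes the same hooks
# (log_success_event / async_log_success_event).
class CustomLogger:
    def log_success_event(self, kwargs, response_obj, start_time, end_time):
        pass


logged = []


class MyHandler(CustomLogger):
    # Logic that previously lived in log_stream_event / async_log_stream_event
    # moves here: log_success_event fires once the call finishes, and the
    # "stream" entry in kwargs tells streamed and non-streamed calls apart.
    def log_success_event(self, kwargs, response_obj, start_time, end_time):
        mode = "stream" if kwargs.get("stream") else "sync"
        logged.append((mode, kwargs.get("model")))
```

In a real deployment the handler instance would be registered via litellm's callback settings; the kwargs keys used here ("stream", "model") are assumptions about the logging payload.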
- bedrock models + show end_user by @ishaan-jaff in #8118
- key Team.team_alias === "Default Team" by @ishaan-jaff in #8122
- LoggingCallbackManager to append callbacks and ensure no duplicate callbacks are added by @ishaan-jaff in #8112
- litellm.disable_no_log_param param by @krrishdholakia in #8134
- litellm.turn_off_message_logging=True by @ishaan-jaff in #8156
New Contributors
Full Changelog: BerriAI/litellm@v1.59.10...v1.60.0
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.59.10
Compare Source
What's Changed
- model param by @ishaan-jaff in #8105
- bedrock/converse_like/<model> route by @krrishdholakia in #8102
Full Changelog: BerriAI/litellm@v1.59.9...v1.59.10
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.59.9
Compare Source
What's Changed
- metadata param preview support + new x-litellm-timeout request header by @krrishdholakia in #8047
New Contributors
Configuration
📅 Schedule: Branch creation - "" (UTC), Automerge - At any time (no schedule defined).
🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.
♻ Rebasing: Whenever PR is behind base branch, or you tick the rebase/retry checkbox.
🔕 Ignore: Close this PR and you won't be reminded about this update again.
This PR was generated by Mend Renovate. View the repository job log.