
Commit 258b674

fix deepinfra test

1 parent 385206c

3 files changed: 8 additions, 2 deletions

litellm/model_prices_and_context_window_backup.json (2 additions, 1 deletion)

@@ -15702,7 +15702,8 @@
         "cache_read_input_token_cost": 2.16e-07,
         "litellm_provider": "deepinfra",
         "mode": "chat",
-        "supports_tool_choice": true
+        "supports_tool_choice": true,
+        "supports_reasoning": true
     },
     "deepinfra/google/gemini-2.0-flash-001": {
         "max_tokens": 1000000,

model_prices_and_context_window.json (2 additions, 1 deletion)

@@ -15702,7 +15702,8 @@
         "cache_read_input_token_cost": 2.16e-07,
         "litellm_provider": "deepinfra",
         "mode": "chat",
-        "supports_tool_choice": true
+        "supports_tool_choice": true,
+        "supports_reasoning": true
     },
     "deepinfra/google/gemini-2.0-flash-001": {
         "max_tokens": 1000000,

tests/test_litellm/llms/deepinfra/test_deepinfra_chat_transformation.py (4 additions, 0 deletions)

@@ -17,6 +17,10 @@ def test_deepseek_supported_openai_params():
     """
     from litellm.llms.deepinfra.chat.transformation import DeepInfraConfig
 
+    # Ensure we're using the local model cost map
+    os.environ["LITELLM_LOCAL_MODEL_COST_MAP"] = "True"
+    litellm.model_cost = litellm.get_model_cost_map(url="")
+
     supported_openai_params = DeepInfraConfig().get_supported_openai_params(model="deepinfra/deepseek-ai/DeepSeek-V3.1")
     print(supported_openai_params)
     assert "reasoning_effort" in supported_openai_params
