Replies: 1 comment
@Linz1248 Thanks! The timeout suggests that the model takes too long to produce a response, likely due to the lengthy chain-of-thought output, as r1 is a reasoning model. If the model works correctly only when the context is small (<= 3 pages), then it is not the right fit for your extraction pipeline. Did you try a non-reasoning LLM? Alternatively, if justified by the complexity of your extraction pipeline (e.g. a large number of complex aspects/concepts), you could increase the default timeout, but this would not be a reliable long-term fix. Choosing a more capable model with an appropriate architecture (not necessarily CoT-capable) is usually the best solution. See the guidance on choosing the right LLM, which details which model architectures are recommended for each extraction task.
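If you want to keep using a model that only behaves reliably on small contexts, one practical workaround is to split long documents into smaller chunks and run extraction per chunk. Here is a minimal stdlib sketch; the `chunk_paragraphs` helper below is hypothetical and not part of ContextGem's API:

```python
def chunk_paragraphs(paragraphs, max_chars=6000):
    """Group paragraphs into chunks of at most max_chars characters each.

    A single paragraph longer than max_chars becomes its own chunk.
    """
    chunks, current, current_len = [], [], 0
    for para in paragraphs:
        # +2 accounts for the blank line re-inserted when joining
        added = len(para) + (2 if current else 0)
        if current and current_len + added > max_chars:
            chunks.append("\n\n".join(current))
            current, current_len = [], 0
            added = len(para)
        current.append(para)
        current_len += added
    if current:
        chunks.append("\n\n".join(current))
    return chunks
```

Each chunk can then be wrapped in its own Document and processed separately, keeping every LLM call within the context size the model handles reliably (at the cost of losing cross-chunk context).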
Your work has been very helpful to me! I have found that the 70B version of the deepseek-r1 model works correctly when the document length is less than 3 pages. However, when it exceeds 3 pages, the following error occurs. What could be the reason?
```
[contextgem] 2025-07-16 11:46:26.905 | INFO | Text is being segmented into paragraphs, as no Paragraph instances were provided...
[contextgem] 2025-07-16 11:46:26.907 | SUCCESS | Process "Document initialization" finished in 0.0 seconds.
[contextgem] 2025-07-16 11:46:26.950 | INFO | Using model ollama_chat/deepseek-r1:70b
[contextgem] 2025-07-16 11:46:26.950 | INFO | API key was not provided. Set `api_key`, if applicable.
[contextgem] 2025-07-16 11:46:26.950 | INFO | Using local model provider. If you experience issues like JSON validation errors with smaller models, see our troubleshooting guide: https://contextgem.dev/optimizations/optimization_small_llm_troubleshooting.html
[contextgem] 2025-07-16 11:48:27.390 | ERROR | Exception occurred while calling LLM API: litellm.APIConnectionError: Ollama_chatException - litellm.Timeout: Connection timed out. Timeout passed=120.0, time taken=120.051 seconds LiteLLM Retried: 3 times
```