
[Bug]: Docker API LLM Operations / Extractions #1611

@Codinator007

Description

crawl4ai version

0.7.6

Expected Behavior

Like other people, I am not able to get LLM extraction to run.
I set up crawl4ai via Docker Compose and want to use my local (same VM) vLLM or Ollama endpoints for LLM extraction.
I wish I could turn on LiteLLM debug logs, because right now this is all I see in the logs:

```
stage_crawl4ai-1  | LiteLLM.Info: If you need to debug this error, use `litellm._turn_on_debug()'.
stage_crawl4ai-1  |
stage_crawl4ai-1  |
stage_crawl4ai-1  | Provider List: https://docs.litellm.ai/docs/providers
```
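As far as I can tell, LiteLLM also reads a `LITELLM_LOG` environment variable, so setting it in the container environment might be a way to get debug output without touching the server code. A minimal sketch, assuming the crawl4ai server process inherits the file:

```
# .llm.env -- assumes the server process inherits this environment
LITELLM_LOG=DEBUG
```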

I tried nearly everything (a standalone LiteLLM repro sketch follows this list):

- `config.yml` with an llm config
- `.llm.env` with a custom `LLM_BASE_URL`
- as the provider I tried:
  - `LLM_PROVIDER=ollama/gpt-oss-20b`
  - `LLM_PROVIDER=hosted_vllm/gpt-oss-20b`
  - `LLM_PROVIDER=hosted_vllm/openai/gpt-oss-20b`
  - etc.
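To take the crawl4ai server out of the equation, this is roughly the standalone check I would run against the same provider strings. A minimal sketch, assuming a local Ollama endpoint on its default port; the model name and `api_base` are placeholders, not known-good values:

```python
# Standalone LiteLLM check: does the provider string reach the local endpoint?
import litellm

litellm._turn_on_debug()  # the switch the log message above suggests

response = litellm.completion(
    model="ollama/gpt-oss-20b",         # provider/model string under test
    api_base="http://localhost:11434",  # local Ollama endpoint (assumed)
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```

If this call fails the same way, the problem is the provider string or the endpoint, not the Docker API layer.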

Current Behavior

There are no LiteLLM debug logs, and extraction with an LLM does not work.
For example, I use the /md endpoint with `f: llm` and `q: Extract only x`.
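For reference, this is roughly the request I am sending (a minimal sketch; the port is the crawl4ai Docker default and the payload shape is my reading of the /md endpoint, so treat both as assumptions):

```python
# Hedged repro of the failing /md call against the Dockerized server
import requests

resp = requests.post(
    "http://localhost:11235/md",       # default crawl4ai Docker port (assumed)
    json={
        "url": "https://example.com",  # placeholder target page
        "f": "llm",                    # LLM-based content filter
        "q": "Extract only x",         # instruction passed to the LLM
    },
    timeout=120,
)
print(resp.status_code, resp.text)
```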

Is this reproducible?

Yes

Inputs Causing the Bug

Steps to Reproduce

Code snippets

OS

Docker on Ubuntu 22.04

Python version

3.10.12 on the host

Browser

No response

Browser version

No response

Error logs & Screenshots (if applicable)

No response
