Build Out of Memory on DigitalOcean – Vectorization Engine Using Docling #2049
Replies: 1 comment
-
Hi @Samsuthin-Kaavian! I'm Dosu and I'm helping the docling team. The large Docker image and "Build Out Of Memory" error are likely caused by Docling's heavy dependencies (especially PyTorch) and automatic model downloads during the build, which can spike memory usage and bloat the image size, even if everything works locally. This is a common issue with Docling deployments on resource-constrained platforms like DigitalOcean. To optimize:

- Move model downloads out of the `docker build` step: fetch them at container startup or onto a mounted volume instead (see the sketch after this list).
- Install the CPU-only PyTorch wheel (`pip install torch --index-url https://download.pytorch.org/whl/cpu`) so pip does not pull the default Linux build with bundled CUDA libraries.
- Trim unused packages from your dependency list and lower build concurrency so the builder stays within DigitalOcean's memory limits.
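As a minimal sketch of the first point, assuming a recent Docling release that ships the `docling.utils.model_downloader` helper and the `artifacts_path` pipeline option (verify both against the version you pin), model weights can be fetched once at startup instead of during `docker build`; `/opt/docling-models` is a placeholder path:

```python
from contextlib import asynccontextmanager
from pathlib import Path

from fastapi import FastAPI

from docling.datamodel.base_models import InputFormat
from docling.datamodel.pipeline_options import PdfPipelineOptions
from docling.document_converter import DocumentConverter, PdfFormatOption
from docling.utils.model_downloader import download_models

# Placeholder location; a mounted DigitalOcean volume lets the weights
# survive redeploys so they are only ever downloaded once.
MODELS_DIR = Path("/opt/docling-models")

@asynccontextmanager
async def lifespan(app: FastAPI):
    # Fetch model weights on first boot, not during `docker build`, so the
    # image stays small and the build stays within the platform memory limit.
    if not MODELS_DIR.exists():
        download_models(output_dir=MODELS_DIR)
    # Point Docling at the local weights so it never downloads at convert time.
    opts = PdfPipelineOptions(artifacts_path=MODELS_DIR)
    app.state.converter = DocumentConverter(
        format_options={InputFormat.PDF: PdfFormatOption(pipeline_options=opts)}
    )
    yield

app = FastAPI(lifespan=lifespan)
```

With this arrangement the weights never enter the image at all, and reusing `app.state.converter` across requests avoids re-initializing the models on every call.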
If you can share your Dockerfile or more build logs, I can give more targeted advice. But the biggest wins will come from moving model downloads out of the build step (as sketched above) and tuning your dependency list and concurrency settings. To reply, just mention @dosu.
-
Hey team,
I’m currently working on a vectorization engine for document processing using Docling. While everything builds fine locally, I encountered a "Build Out Of Memory" error when deploying it to DigitalOcean.
Surprisingly, the Docker image size is 4.24 GB, which seems quite large. Here's my current Pipfile for reference:
```toml
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"
[scripts]
start = "fastapi dev index.py"
[packages]
fastapi = {extras = ["standard"], version = "*"}
uvicorn = "*"
openai = "*"
python-dotenv = "*"
qdrant-client = "*"
docling = "*"
boto3 = "*"
requests = "*"
jmespath = "*"
tiktoken = "*"
[dev-packages]
[requires]
python_version = "3.12"
```
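One thing worth verifying first, since `docling` pulls in PyTorch transitively: the default `torch` wheel on Linux bundles CUDA libraries even on CPU-only machines, and those libraries alone can account for a large share of a multi-gigabyte image. A quick diagnostic, run inside the built image, shows which variant was installed:

```python
import torch

# The CPU-only wheel (installed from https://download.pytorch.org/whl/cpu)
# carries a "+cpu" local version tag and reports no CUDA runtime; the
# default PyPI wheel reports a plain version plus a CUDA version string.
print(torch.__version__)   # e.g. "2.4.0+cpu" for the CPU-only build
print(torch.version.cuda)  # None for the CPU-only build
```

For example: `docker run --rm <your-image> python -c "import torch; print(torch.__version__, torch.version.cuda)"`.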
If you need any additional details (logs, Dockerfile, etc.), feel free to ask. I’d appreciate any suggestions on optimizing the build or reducing the image size.
Thanks!