What are the system requirements for running AgenticSeek with ollama #433
Replies: 2 comments
The RTX 4060 has 8 GB of VRAM, which really is the bare minimum for running AgenticSeek; you are likely to hit out-of-memory issues with a 7B (billion parameter) model. I advise you to play around with smaller models around 3B, like gemma or llama3.2:3b (https://ollama.com/library/llama3.2:3b).
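If it helps, the suggestion above boils down to a couple of ollama commands. This is a sketch, assuming the `llama3.2:3b` tag from the linked library page; the prompt text is just a placeholder:

```shell
# Pull a ~3B model that fits comfortably in 8 GB of VRAM.
ollama pull llama3.2:3b

# Quick interactive sanity check before wiring it into AgenticSeek.
ollama run llama3.2:3b "Summarize what a web-browsing agent does."

# If memory still feels tight, check how much of the model actually
# landed on the GPU versus spilling to CPU RAM.
ollama ps
```

`ollama ps` shows the GPU/CPU split for loaded models, which is a quick way to confirm the model is not partially offloaded to system memory.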
That hardware should be enough to explore the project, but the model choice will matter a lot more than the framework itself. On a 4060, I would start with smaller quantized models and get a feel for latency before assuming a broader local setup will stay comfortable. I have seen setups like this feel fine on simple tasks and then slow down sharply once the workload becomes more agentic or multi-step. So I would benchmark with the exact kind of task you care about, not just a basic smoke test.
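For a concrete starting point on benchmarking: ollama's non-streaming `/api/generate` response includes `eval_count` (tokens generated) and `eval_duration` (nanoseconds), so you can measure generation speed on a realistic prompt. A minimal sketch, assuming the default local endpoint and the `llama3.2:3b` model; the function names are mine:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # ollama's default endpoint


def tokens_per_second(resp: dict) -> float:
    """Generation speed from an ollama /api/generate response.

    eval_count is the number of generated tokens; eval_duration is in
    nanoseconds, so scale by 1e9 to get tokens per second.
    """
    return resp["eval_count"] / resp["eval_duration"] * 1e9


def benchmark(model: str, prompt: str) -> float:
    """Send one non-streaming generate request and report tokens/sec."""
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as r:
        return tokens_per_second(json.load(r))


# Usage (requires a running ollama server), with a prompt shaped like
# the agentic work you actually plan to run:
# print(benchmark("llama3.2:3b", "Plan the steps to scrape a web page."))
```

Run it with a multi-step, agent-style prompt rather than "hello", since that is where an 8 GB card tends to slow down first.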
I want to install AgenticSeek, but I have a question: what are the system requirements for running AgenticSeek with ollama?
Btw, my PC specs:
CPU : AMD Ryzen 7 7700
GPU : NVIDIA RTX 4060 8 GB
RAM : 32 GB DDR5, 6400 MHz
SSD : 1 TB Gen 4 NVMe
MOB : MSI PRO B650-S Wi-Fi