I am using beeai version 0.2.1. Here is how my beeai env is set up: The error I see is:
Replies: 9 comments · 25 replies
Hi @neehar18 - thanks for reporting! This should be resolved with the latest release. Note: if you have previously installed
@jenna-winkler thanks for the prompt response, appreciate it. I am still seeing the same error after upgrading: I also tried with the default env settings (instead of setting them up manually as described in the docs), but still no luck:
@neehar18 can you confirm that Ollama is downloaded and running, with the model you want to run already pulled?
@jenna-winkler yes, Ollama is downloaded and running. I am using IBM's granite3.3:2b (I've also tried granite3.3:8b, and see the same issue there as well).
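A quick way to verify both conditions from the reply above — server up and model pulled — is a short check like this (a sketch using the standard Ollama CLI and its default port 11434; guarded so it is safe to run even where Ollama is not installed):

```shell
# Check that the Ollama server is responding and that a granite3.3 model
# has been pulled. All branches fall through to a message, never an error.
if command -v ollama >/dev/null 2>&1; then
  if curl -s http://localhost:11434/api/tags >/dev/null; then
    echo "Ollama server is responding"
  else
    echo "Ollama is installed but the server is not running"
  fi
  ollama list | grep granite3.3 || echo "no granite3.3 model pulled yet"
else
  echo "ollama not found on PATH"
fi
```

If the model is missing, `ollama pull granite3.3:2b` fetches it.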
Could you try deleting the platform and starting it again? There was a bug in an older version where, during VM creation, host.docker.internal was not resolving correctly.
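A reset along those lines might look like the sketch below. Note the subcommand names (`platform delete`, `platform start`) are assumptions here, not confirmed in this thread — check `beeai platform --help` on your install for the exact spelling in your version:

```shell
# Hypothetical platform reset -- subcommand names are assumed,
# verify them against `beeai platform --help` first.
if command -v beeai >/dev/null 2>&1; then
  beeai platform delete   # tear down the Lima VM the platform runs in
  beeai platform start    # recreate the VM from scratch
else
  echo "beeai CLI not found on PATH"
fi
```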
@jezekra1 I tried deleting the platform and starting again, but that isn't working either. I did the following:
@jenna-winkler @jezekra1 is there any way I can get this working? I have tried everything: I have uninstalled, run brew cleanup, cleared the brew cache, and reinstalled. Nothing seems to work.
At this point, I'm not sure whether my environment got broken by so many uninstalls and reinstalls, or whether there is a problem with the version itself. I would also appreciate clear uninstall/cleanup documentation.
Thanks a lot @jezekra1 for the constant follow-up messages!! You have restored my faith in the open source community!! Also, thanks to @jenna-winkler @tomkis

Ok, we have two limactl processes listening on 8333, and we need to stop one of them. I have no idea where the other Lima instance is hiding if it's not in the default location and not in ~/.beeai.
What I would try is:
1. Reboot the laptop, then run `curl localhost:8333` immediately after boot (or `lsof -i` again) to see whether the process starts automatically. If not: problem solved.
2. List what command is running under the PIDs from `lsof` where the ports are in LISTEN mode, not CLOSED.
3. Kill the processes, or find the running instance and stop it using `limactl`.
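The diagnostic steps above can be sketched concretely (standard `lsof`, `ps`, and `limactl` invocations; the instance name to stop is left as a placeholder since it is exactly what we are trying to find):

```shell
PORT=8333
if command -v lsof >/dev/null 2>&1; then
  # show only listeners; -sTCP:LISTEN filters out CLOSED sockets
  lsof -nP -iTCP:"$PORT" -sTCP:LISTEN || echo "nothing listening on $PORT"
  # print the full command line behind each listening PID
  for pid in $(lsof -t -iTCP:"$PORT" -sTCP:LISTEN 2>/dev/null); do
    ps -p "$pid" -o pid=,command=
  done
else
  echo "lsof not available"
fi
# list the Lima instances limactl knows about; stop the stray one by name
if command -v limactl >/dev/null 2>&1; then
  limactl list
  # limactl stop <instance-name>   # fill in the right name from the list
fi
# kill <PID>   # last resort, for a process limactl does not know about
```

`limactl stop` is preferable to a raw `kill` because it shuts the VM down cleanly.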
If this doesn't help, let's just jump on a call; it'll be quicker than copy-pasting command output back and forth. I think we …