Is it possible to run GPT4All on a Linux-based cluster to make it faster? I have 6 systems that run an OpenMPI cluster, and I would like to be able to run GPT4All across the cluster, with 24 GB of RAM on the main system and 8 GB on each of the other 5 systems.

Replies: 1 comment
GPT4All is based on llama.cpp, which has OpenMPI and OpenSHMEM support. You should try their …
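Roughly, trying their MPI path on a six-node OpenMPI cluster might look like the sketch below. The `LLAMA_MPI=1` make flag, the `./main` binary, and the hostnames and model path are assumptions based on older llama.cpp MPI build instructions, so check the current llama.cpp README before relying on any of it:

```sh
# On every node: build llama.cpp with its MPI backend enabled.
# LLAMA_MPI=1 was the make flag documented in older llama.cpp versions;
# this backend has been reworked since, so verify against the current README.
make CC=mpicc CXX=mpicxx LLAMA_MPI=1

# Hypothetical OpenMPI hostfile for the six machines (hostnames are placeholders).
cat > hostfile <<'EOF'
main-node slots=1
node1 slots=1
node2 slots=1
node3 slots=1
node4 slots=1
node5 slots=1
EOF

# One rank per machine. Rank 0 (run it on the 24 GB node) handles the prompt
# and output while the model's layers are split across the ranks.
mpirun --hostfile ./hostfile -np 6 ./main \
    -m ./models/7B/ggml-model-q4_0.gguf -p "Hello from the cluster" -n 128
```

The binary and the model file would need to be reachable at the same path on every node (for example over NFS), and it is worth checking how the layer split across ranks interacts with the uneven RAM on your machines (24 GB on one, 8 GB on the others).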