llama.cpp server-cuda-b6090
Install from the command line
$ docker pull ghcr.io/shalinib-ibm/llama.cpp:server-cuda-b6090
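Once the image is pulled, the llama.cpp server can be started with `docker run`. The command below is a minimal sketch, assuming the NVIDIA Container Toolkit is installed on the host and that a GGUF model file exists at the path shown; the model path, port, and GPU layer count are placeholders to adjust for your setup.

$ docker run --gpus all \
    -v /path/to/models:/models \
    -p 8080:8080 \
    ghcr.io/shalinib-ibm/llama.cpp:server-cuda-b6090 \
    -m /models/model.gguf \
    --host 0.0.0.0 --port 8080 \
    --n-gpu-layers 99

The `--gpus all` flag exposes the host GPUs to the container, `-v` mounts the host model directory into the container, and the remaining arguments are passed through to the llama.cpp server binary.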