Labels: bug (Something isn't working)
Description
Describe the bug
I've tried passing `--rpc=10.x.x.x(serverip):50052` via extra-flags, in the interface, on the command line, and in CMD_FLAGS in the user_data directory, but the rpc parameter never reaches llama.cpp.
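For reference, plain shell-style splitting of an extra-flags string keeps both leading dashes intact, so the single-dash `-rpc` in the error below suggests the flag is being altered somewhere in the launch pipeline rather than by tokenization. This is a sketch under the assumption that the webui tokenizes CMD_FLAGS with shlex-style rules; the address is a placeholder:

```python
import shlex

# Placeholder extra-flags string as it might appear in CMD_FLAGS
# (assumption: the webui splits this string with shell-style rules).
cmd_flags = "--rpc=10.0.0.2:50052"

tokens = shlex.split(cmd_flags)
print(tokens)  # ['--rpc=10.0.0.2:50052'] -- both dashes survive splitting
```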
Is there an existing issue for this?
- I have searched the existing issues
Reproduction
ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no
ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
ggml_cuda_init: found 2 CUDA devices:
Device 0: NVIDIA GeForce RTX 5090, compute capability 12.0, VMM: yes
Device 1: NVIDIA GeForce RTX 3090, compute capability 8.6, VMM: yes
error: invalid argument: -rpc
19:09:54-049135 ERROR Error loading the model with llama.cpp: Server process terminated unexpectedly with exit code: 1
Screenshot
No response
Logs
(Same llama.cpp output as in the Reproduction section above.)

System Info
Ultra9, 224 GB RAM, RTX 5090, plus an RTX 3090 in a TB5 enclosure, and a secondary computer that I want to use via RPC.