Description
OS
Linux
GPU Library
CUDA 12.x
Python version
3.11
Pytorch version
Unsure
Model
ReadyArt/TheDrummer_Behemoth-ReduX-123B-v1-EXL2
Describe the bug
A triple-3090 setup has run this model flawlessly for months. After adding my 2080 Ti to the setup, TabbyAPI produces gibberish output.
I'm not yet sure whether the problem is TabbyAPI or defective VRAM on the 2080 Ti; however, llama.cpp does not exhibit the issue on the same hardware. A quick per-GPU check is sketched below.
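One way to separate a defective card from a TabbyAPI problem is to run the same deterministic computation on every GPU and compare against a CPU reference. This is only a minimal sketch; the matrix size and the idea of comparing max absolute error are arbitrary choices, not a TabbyAPI-provided diagnostic:

```python
# Quick per-GPU sanity check: identical matmul on each card vs. a CPU reference.
# A much larger error on one device suggests hardware, not TabbyAPI.
import torch

torch.manual_seed(0)
a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)
ref = a @ b  # fp32 CPU reference

for i in range(torch.cuda.device_count()):
    dev = f"cuda:{i}"
    out = (a.to(dev) @ b.to(dev)).cpu()
    err = (out - ref).abs().max().item()
    print(f"{dev} ({torch.cuda.get_device_name(i)}): max abs error = {err:.3e}")
```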
Reproduction steps
Add a 2080 Ti to a multi-3090 split and generate text with the model (a sample request is sketched below).
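For reference, a minimal request against TabbyAPI's OpenAI-compatible completions endpoint might look like the following. The host, port, API key, and prompt here are assumptions; adjust them to your setup:

```python
# Hypothetical repro request; host, port, and API key are assumptions.
import requests

resp = requests.post(
    "http://127.0.0.1:5000/v1/completions",
    headers={"Authorization": "Bearer YOUR_API_KEY"},
    json={"prompt": "The capital of France is", "max_tokens": 32, "temperature": 0.0},
    timeout=120,
)
# With the 2080 Ti in the GPU split, this text comes back as gibberish.
print(resp.json()["choices"][0]["text"])
```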
Expected behavior
Coherent output, matching the 3090-only configuration.
Logs

Additional context
Running on Debian 12.
Acknowledgements
- I have looked for similar issues before submitting this one.
- I understand that the developers have lives and my issue will be answered when possible.
- I understand the developers of this program are human, and I will ask my questions politely.