Replies: 5 comments 6 replies
-
In terms of cost, you can get a used GTX 1070 for around $100 and get excellent performance for STT and TTS, with roughly 450-700 ms response times for STT. The processor isn't really a factor, so you could go with something like a low-TDP i5 for lower power draw. For example, I run Proxmox on a system with 16 GB RAM, a 500 GB M.2 drive, a quad-core i5, and a GTX 1070, and it draws around 40 watts most of the time. On it I run Home Assistant and a Docker server, which hosts WIS and WAS. When I make an inference request the GPU spikes to 60 W or so for a second or two and then drops right back to 9 W idle. So if you can find a mid tower that could fit a GTX 1070, that may work. 🙂 Also note that because the system is mostly idle, its thermal requirements are low in general.
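If you want to verify that kind of idle/spike behavior on your own card, here's a minimal sketch, assuming a Linux host with the NVIDIA driver and `nvidia-smi` installed:

```shell
#!/bin/sh
# Sketch: print the GPU's name, current power draw, and power limit.
# Assumes nvidia-smi is on PATH; falls back to a message if it isn't.
if command -v nvidia-smi >/dev/null 2>&1; then
    nvidia-smi --query-gpu=name,power.draw,power.limit \
               --format=csv,noheader
else
    echo "nvidia-smi not found: install the NVIDIA driver first"
fi
```

Run it under `watch -n 1` (or use nvidia-smi's `-l 1` loop flag) while making an inference request to see the spike and the drop back to idle.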
-
I'm excited just imagining having all that running. But I'm searching for mid towers on Google and they're around 45x45x23 cm; is that about the size you have? Unfortunately that's a bit too big for me, I live in a small apartment =(
-
Hm, that is actually a great idea. It's been many years since I last assembled a computer myself, but this is worth it! Thank you.
On Tue, Jul 11, 2023 at 03:17, Nick Bento ***@***.***> wrote:
> Could also see if you can squeeze a GTX 1070 into a well-structured ITX chassis?
-
I've run a 1060 and a 1070 in Razer and Sonnet eGPU enclosures connected to a laptop or an Intel NUC. As long as the computer has a Thunderbolt 3 connection, this works. It can be a bit fiddly getting the NVIDIA driver to load and fixing any conflicts with the internal graphics. I'm running Ubuntu 22.04 and the latest (535) driver, which was necessary to get an RTX 4090 running in this configuration. Note that 1070-class GPUs don't have enough VRAM for chatbots like Vicuna 13B, so if you're interested in that later on you may need two GPUs or a single big one (e.g. a 3090 or 4090).
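When the eGPU is being fiddly, the first thing to check is whether the NVIDIA kernel module actually loaded. A quick sketch, assuming a Linux host (where `/proc/modules` lists loaded modules):

```shell
#!/bin/sh
# Sketch: check whether the nvidia kernel module is loaded (Linux only).
# /proc/modules lines start with the module name followed by a space.
if grep -q '^nvidia ' /proc/modules 2>/dev/null; then
    echo "nvidia module loaded"
else
    echo "nvidia module not loaded (check driver install, Secure Boot, TB3 link)"
fi
```

If the module isn't loaded, `dmesg` output around the time the enclosure was plugged in usually shows why.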
-
Given the performance requirements, if you're looking for a tiny form factor I feel like Intel NUCs might be your best bet. They have good built-in GPUs that should be able to handle WIS. The only problem is I don't know what they're going for used, and Intel recently announced they're no longer making them.
-
Hey, I'd like to host a WIS instance, and had the idea of doing it on my Raspberry Pi 4. But it looks like that won't work and I'll need much more powerful hardware with a decent GPU, right?
Still, I don't want a full-sized computer/laptop running all the time for that. Is there any compact option you'd suggest that would run WIS well without being too expensive?