Stable Diffusion does not seem to be working when used with ModelSpecs #2793
Replies: 3 comments
-
Unfortunately, tools are only available for use if you set the endpoint to:
-
@danny-avila Ahh, I see. Thanks again for the quick response (as always), very much appreciated! FYI, I think I found something interesting, or at least something I'm glad I got working, even if I don't entirely understand exactly why it's working, lol.
I'm glad it's working for me; it's just a bit confusing why and how both LocalAI and Stable Diffusion WebUI Docker are needed, since these are entirely separate containers that do not communicate with one another. LocalAI is doing all the heavy lifting of the actual image generation, but I still need the Stable Diffusion WebUI Docker container running so that SD_WEBUI_URL can point to an AUTOMATIC1111 interface. Just bringing this to your attention in case it's helpful! You mentioned a while ago that you are still working on tool usage for non-OpenAI endpoints. From my experience/testing:
I'm not a developer at all, so I'm just sharing a homelab user's perspective: it'd be great if it were possible to configure librechat.yaml with the following endpoints:
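Separately, in case the container wiring described above is useful to anyone else, here is a rough sketch of how the two pieces fit together on my side; the service name, image, and port are placeholders rather than my exact compose file:

```yaml
# docker-compose excerpt (sketch): LibreChat's Stable Diffusion plugin only needs
# an AUTOMATIC1111-style API to point at, while LocalAI does the actual generation.
services:
  stable-diffusion-webui:
    image: your-automatic1111-webui-image   # placeholder; whatever A1111 build you run
    ports:
      - "7860:7860"

# and in LibreChat's .env the plugin is pointed at that container:
#   SD_WEBUI_URL=http://stable-diffusion-webui:7860
```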
-
Hi, can you describe again how you managed to get Stable Diffusion to work with your LocalAI setup? I am having problems.
-
What happened?
Trying to get the Stable Diffusion plugin to become available in my self-hosted environment. Previously I got the Stable Diffusion plugin to work by using LiteLLM as a proxy for OpenAI to connect to LocalAI (which does actually work). But I think the newly implemented modelSpecs would be a cleaner way to achieve what I'm trying to set up, and it seems like it should work without requiring LiteLLM at all, or at least that's how I understand the new documentation. My understanding is that a modelSpec should allow LibreChat to connect directly to a Stable Diffusion instance.
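For reference, the kind of direct connection I mean (no LiteLLM proxy in between) would just be an endpoint whose baseURL points at LocalAI; the sketch below uses placeholder names, URLs, and models rather than my actual config:

```yaml
# librechat.yaml excerpt (sketch): a custom endpoint talking straight to LocalAI.
endpoints:
  custom:
    - name: "LocalAI"                     # placeholder label
      apiKey: "sk-anything"               # LocalAI generally doesn't check the key
      baseURL: "http://localai:8080/v1"   # placeholder host/port for the LocalAI container
      models:
        default: ["gpt-4"]                # whatever model names LocalAI exposes
```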
The following is what I am doing:
Steps to Reproduce
and add the following pluginKey to the preset's tools:
tools: ["stable-diffusion"]
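For context, that key sits under the spec's preset in the modelSpecs schema, roughly like this; the spec name, endpoint, and model below are placeholders rather than my exact config:

```yaml
# librechat.yaml excerpt (sketch): a modelSpec whose preset carries the stable-diffusion tool.
modelSpecs:
  list:
    - name: "sd-images"          # placeholder spec name
      label: "Image generation"
      preset:
        endpoint: "openAI"       # placeholder; whichever endpoint the spec targets
        model: "gpt-4"           # placeholder model
        tools: ["stable-diffusion"]
```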
Expectation:
Plugins will become available via the endpointsMenu and I will be able to install/use Stable Diffusion.
Reality:
The model behaves as if no stable-diffusion tool were added; plugins are still not made available.
The following is my modelSpec: