ComfyUi Node replicate for LLM; Dev Questions #1266
Unanswered
ArcadiaFrame asked this question in Q&A
Replies: 2 comments
- https://github.com/Ironclad/rivet just became open source, and there's an oobabooga text-generation-webui plugin I made for it here: https://github.com/hushaudio/rivet-oobabooga-plugin. Endless possibilities.
- Apart from Rivet, you can also use a custom ComfyUI node, or use FlowiseAI.
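Since the reply above suggests writing a custom ComfyUI node for LLM calls, here is a minimal sketch of what such a node could look like. The endpoint URL, port, payload fields, and the class/node names (`LLMPromptNode`, etc.) are assumptions for illustration, loosely matching the OpenAI-compatible completions API that text-generation-webui exposes when started with `--api`; adjust them to whatever local backend you actually run.

```python
# custom_nodes/llm_prompt_node.py -- sketch of a ComfyUI custom node that
# sends a prompt to a locally running LLM server and returns the completion.
import json
import urllib.request


class LLMPromptNode:
    """Send a prompt string to a local LLM API and return its completion."""

    @classmethod
    def INPUT_TYPES(cls):
        return {
            "required": {
                "prompt": ("STRING", {"multiline": True, "default": ""}),
                "max_tokens": ("INT", {"default": 200, "min": 1, "max": 4096}),
                "temperature": ("FLOAT", {"default": 0.7, "min": 0.0, "max": 2.0}),
            }
        }

    RETURN_TYPES = ("STRING",)
    FUNCTION = "generate"
    CATEGORY = "LLM"

    def generate(self, prompt, max_tokens, temperature):
        # Assumed endpoint: text-generation-webui's OpenAI-compatible
        # completions route on port 5000. Change to match your setup.
        payload = json.dumps({
            "prompt": prompt,
            "max_tokens": max_tokens,
            "temperature": temperature,
        }).encode("utf-8")
        req = urllib.request.Request(
            "http://127.0.0.1:5000/v1/completions",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            body = json.loads(resp.read().decode("utf-8"))
        return (body["choices"][0]["text"],)


# ComfyUI discovers custom nodes through these mappings.
NODE_CLASS_MAPPINGS = {"LLMPromptNode": LLMPromptNode}
NODE_DISPLAY_NAME_MAPPINGS = {"LLMPromptNode": "LLM Prompt (local API)"}
```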
- I am interested in creating a similar interface for the web UIs used with LLMs, to see whether a node-based workflow can improve the efficiency and load capacity of LLM models (either GGML or GPTQ) by creating isolated nodes for processing the models and prompts. I have also been learning to code so I can understand more about AI processing and troubleshoot the back end of the local installations I work with. I am using both Stable Diffusion and LLMs, mostly in Oobabooga at the moment, which is like an Automatic1111 for LLMs. My background is in psychology, focused on learning, personality, perception, and cognition, along with some knowledge from being an enthusiast PC builder since age 12. My end goal is to program parameters that allow for alternating pathways, or to set up multiprocessing of models: letting one model hallucinate freely, refining that output through another model with stricter response policies, and combining them into cognitive models for more detailed and tailored responses. Eventually I want to see how analytical or Jungian typology, or other theories of information-processing modules, could shape the pathways that LLMs or even AI art generation follow.
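As a very rough illustration of the "let one model hallucinate, then refine through a stricter model" idea described above, the sketch below chains two generation calls: a loose, high-temperature draft pass followed by a low-temperature refinement pass with a policy-style instruction. The `generate()` helper, the endpoint URL, and the prompts are hypothetical placeholders rather than part of any existing project; swap in whichever local backend (GGML/GPTQ models served through Oobabooga, llama-cpp-python, or any OpenAI-compatible server) you actually run.

```python
# Sketch of a two-stage "draft then refine" pipeline with assumed endpoints.
import json
import urllib.request

API_URL = "http://127.0.0.1:5000/v1/completions"  # assumed local endpoint


def generate(prompt: str, temperature: float, max_tokens: int = 300) -> str:
    """Call a local OpenAI-compatible completions endpoint (assumption)."""
    payload = json.dumps({
        "prompt": prompt,
        "temperature": temperature,
        "max_tokens": max_tokens,
    }).encode("utf-8")
    req = urllib.request.Request(
        API_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["text"]


def draft_then_refine(question: str) -> str:
    # Stage 1: high temperature encourages loose, exploratory output
    # (the deliberate "hallucination" pass).
    draft = generate(f"Answer freely and speculatively:\n{question}\n", temperature=1.2)

    # Stage 2: a stricter pass asks a second (or the same) model, at low
    # temperature, to keep only claims it can support.
    refine_prompt = (
        "Below is a speculative draft answer. Rewrite it, keeping only "
        "well-supported statements and flagging anything uncertain.\n\n"
        f"Question: {question}\n\nDraft:\n{draft}\n\nRevised answer:\n"
    )
    return generate(refine_prompt, temperature=0.2)


if __name__ == "__main__":
    print(draft_then_refine("How do GGML and GPTQ quantization differ?"))
```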