Is it possible to have a Mixture of Experts (MoE) in ComfyUI? #4643
That is an interesting topic. If we approach it macroscopically rather than at the network level, it would be possible to interpret the prompt and switch between different checkpoints, LoRAs, etc. accordingly. With a bit more work, that could be extended to on-the-fly model merging. Going lower than that, into the network itself, would require thinking about resource usage and doing more in-depth research into the architecture and model patching. A rough sketch of the macroscopic approach follows.
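As a rough illustration of that macroscopic idea, here is a minimal sketch of a custom node that routes between two checkpoints based on keywords in the prompt. The class name, keyword heuristic, and two-expert setup are all hypothetical; the only thing assumed is the standard ComfyUI custom-node conventions (`INPUT_TYPES`, `RETURN_TYPES`, `FUNCTION`, and the `NODE_CLASS_MAPPINGS` export).

```python
# Hypothetical sketch: a prompt-keyword "gate" that picks one of two expert models.
# Only the standard ComfyUI custom-node conventions are assumed here.

class PromptExpertRouter:
    """Pick one of two 'expert' models based on keywords found in the prompt."""

    @classmethod
    def INPUT_TYPES(cls):
        return {
            "required": {
                "prompt": ("STRING", {"multiline": True}),
                "expert_a": ("MODEL",),   # e.g. a photorealistic checkpoint
                "expert_b": ("MODEL",),   # e.g. an illustration-style checkpoint
                "keywords_for_b": ("STRING", {"default": "anime, illustration, cartoon"}),
            }
        }

    RETURN_TYPES = ("MODEL",)
    FUNCTION = "route"
    CATEGORY = "experimental/moe"

    def route(self, prompt, expert_a, expert_b, keywords_for_b):
        # Naive gating: if any keyword appears in the prompt, hand off to expert B.
        keywords = [k.strip().lower() for k in keywords_for_b.split(",") if k.strip()]
        use_b = any(k in prompt.lower() for k in keywords)
        return (expert_b if use_b else expert_a,)


NODE_CLASS_MAPPINGS = {"PromptExpertRouter": PromptExpertRouter}
NODE_DISPLAY_NAME_MAPPINGS = {"PromptExpertRouter": "Prompt Expert Router (MoE sketch)"}
```

One caveat with this kind of gate: since both MODEL inputs are required, the upstream checkpoint loaders for both experts will still run, so it only decides which model feeds the sampler rather than saving memory. A smarter gate (a small classifier, an LLM-based prompt interpreter, or per-expert LoRA stacking) would slot into the same node shape.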
I’m currently exploring techniques for model optimization and scalability, and I’d like to use a Mixture of Experts (MoE) approach within ComfyUI. Specifically, I want to know whether ComfyUI supports this type of architecture, either out of the box or through custom extensions.
If it isn’t directly supported, are there any recommended practices or workarounds for implementing MoE in ComfyUI? I would also appreciate guidance on how to efficiently route different inputs to specialized experts and manage the gating mechanism within the UI.
Looking forward to any insights or examples.