Using multiple different models and one decision model #294
JimmyInsane started this conversation in Ideas
Replies: 1 comment
-
Yes, this is called a "Mixture of Experts" approach and is supposedly what GPT-4 uses.
-
Hi, an idea popped into my mind; maybe it's somehow doable.
Since different models excel at different tasks or offer different kinds of answers, we could use the following approach (numbers are examples only):
Load, say, 4 models, each with its own prime-prompt. 3 of them get the actual request (plus history, of course) and are queried one after another, and the 4th model takes the different answers (plus history) and decides which answer to use. I think that's how GPT-4 works, and it might result in a richer conversation.
So the idea is basically to be able to load multiple smaller models (or maybe bigger ones, if they don't remain in RAM after the query) that process the prompt, plus the decision model.
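A rough sketch of how that wiring could look, in Python. This is only an assumption about the flow; `QueryFn`, `ask_ensemble`, and the prompt formats are hypothetical placeholders, not anything the project already provides:

```python
# Hedged sketch only: `QueryFn` stands in for whatever call the backend exposes
# to run a loaded model (a binding, an HTTP endpoint, ...). Nothing here is an
# existing API of this repo; names and prompt formats are made up.
from typing import Callable, List

QueryFn = Callable[[str], str]  # full prompt in, model's answer out


def ask_ensemble(
    answer_models: List[QueryFn],  # e.g. 3 models, each already set up with its own prime-prompt
    decision_model: QueryFn,       # the extra model that picks which answer to use
    history: str,
    request: str,
) -> str:
    prompt = f"{history}\nUser: {request}\nAssistant:"

    # Query the answer models one after another, so only one has to sit in RAM at a time.
    candidates = [model(prompt) for model in answer_models]

    # Hand history + request + all candidate answers to the decision model and
    # let it choose the reply that goes back to the user.
    numbered = "\n".join(f"Answer {i + 1}: {a}" for i, a in enumerate(candidates))
    decision_prompt = (
        f"{history}\nUser: {request}\n\n"
        f"Candidate answers:\n{numbered}\n\n"
        "Pick the single best answer and reply with it verbatim:"
    )
    return decision_model(decision_prompt)


if __name__ == "__main__":
    # Dummy stand-ins so the sketch runs without any real model backend.
    echo = lambda p: "echo of: " + p.splitlines()[-2]
    pick_first = lambda p: p.split("Answer 1: ", 1)[-1].splitlines()[0]
    print(ask_ensemble([echo, echo, echo], pick_first, history="", request="Hello"))
```

Querying the answer models sequentially rather than in parallel is what would keep the memory footprint close to a single model (at the cost of latency), which matches the "don't remain in RAM after the query" idea above.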