My understanding is that the current implementation of llm-router supports only these two models out of the box:
```
strong_model="gpt-4-1106-preview",
weak_model="anyscale/mistralai/Mixtral-8x7B-Instruct-v0.1",
```
Is that correct? It looks like `routellm/calibrate_threshold.py` only reads precomputed scores from this dataset: https://huggingface.co/datasets/routellm/lmsys-arena-human-preference-55k-thresholds
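For context, my understanding of what that calibration step does conceptually: given one router score per query (higher score meaning the router is more confident the strong model is needed), it picks the cutoff that sends a target fraction of traffic to the strong model. A minimal self-contained sketch (the `scores` values and 50% target below are made-up illustration data, not from the dataset):

```python
import numpy as np

def calibrate_threshold(scores, strong_model_pct):
    """Pick the score cutoff that routes `strong_model_pct` of queries
    to the strong model (queries with score >= cutoff go strong)."""
    return float(np.quantile(scores, 1.0 - strong_model_pct))

# Made-up router scores for eight queries.
scores = [0.1, 0.4, 0.35, 0.8, 0.65, 0.9, 0.2, 0.55]
threshold = calibrate_threshold(scores, strong_model_pct=0.5)
routed_strong = sum(s >= threshold for s in scores) / len(scores)
print(threshold, routed_strong)  # 0.475 0.5 -> half the queries go strong
```

If that reading is right, the thresholds in the dataset are tied to the router's score distribution, not to any particular strong/weak model pair.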
How do I train a router to pick between these two Bedrock models (I have swapped the names so the 405B model is the strong one):

```
strong_model="meta.llama3-1-405b-instruct-v1:0",
weak_model="meta.llama3-1-8b-instruct-v1:0",
```
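Based on my reading of the README's `Controller` example and litellm's `bedrock/` provider prefix convention, I assume the goal would look something like the sketch below (untested; the `bedrock/` prefixes, the `"mf"` router choice, and the `0.11593` threshold are my assumptions, with the threshold copied from the README's calibrated example rather than calibrated for Bedrock traffic):

```python
from routellm.controller import Controller

# Assumes AWS credentials for Bedrock are already configured for litellm.
client = Controller(
    routers=["mf"],  # matrix-factorization router shipped with RouteLLM
    strong_model="bedrock/meta.llama3-1-405b-instruct-v1:0",
    weak_model="bedrock/meta.llama3-1-8b-instruct-v1:0",
)

response = client.chat.completions.create(
    # 0.11593 is the README's example threshold; presumably this needs
    # recalibration for my own query distribution.
    model="router-mf-0.11593",
    messages=[{"role": "user", "content": "Hello"}],
)
```

Is it enough to recalibrate the threshold for a new model pair like this, or does the router itself need retraining?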