
Commit 3dd2252

Fix default model for agent store app
Summary:

Ran the command only to find that all-MiniLM-L6-v2, an embedding model, was being selected by default.

```
python -m examples.agent_store.app localhost 5000
Using model: all-MiniLM-L6-v2
* Running on local URL: http://0.0.0.0:7860
```

Test Plan:

Run the command with the change and validate that meta-llama/Llama-3.2-3B-Instruct is used instead.

```
python -m examples.agent_store.app localhost 5000
Using model: meta-llama/Llama-3.2-3B-Instruct
* Running on local URL: http://0.0.0.0:7860
```
1 parent e166be6 commit 3dd2252

File tree

1 file changed: 5 additions, 1 deletion


examples/agent_store/api.py

Lines changed: 5 additions & 1 deletion
@@ -38,7 +38,11 @@ class AgentChoice(Enum):
 class AgentStore:
     def __init__(self, host, port) -> None:
         self.client = LlamaStackClient(base_url=f"http://{host}:{port}")
-        available_models = [model.identifier for model in self.client.models.list()]
+        available_models = [
+            model.identifier
+            for model in self.client.models.list()
+            if model.model_type == "llm"
+        ]
         if not available_models:
             print(colored("No available models. Exiting.", "red"))
             sys.exit(1)
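For context, here is a minimal, self-contained sketch of how the filtered list would plausibly drive the default-model choice reported in the test plan. Only the list-comprehension filter is part of this commit; the `pick_default_model` helper and the `llm_models[0]` fallback are assumptions for illustration, not code from the app.

```python
# Sketch only: shows how filtering on model_type keeps embedding models
# (e.g. all-MiniLM-L6-v2) from becoming the default. The helper and the
# first-model fallback are assumed for illustration.
from llama_stack_client import LlamaStackClient


def pick_default_model(host: str, port: int) -> str:
    client = LlamaStackClient(base_url=f"http://{host}:{port}")
    # Keep only chat/completion models; embedding models report a
    # different model_type and are skipped.
    llm_models = [
        model.identifier
        for model in client.models.list()
        if model.model_type == "llm"
    ]
    if not llm_models:
        raise RuntimeError("No available models.")
    # Assumed behavior: fall back to the first LLM found, e.g.
    # meta-llama/Llama-3.2-3B-Instruct in the test plan above.
    return llm_models[0]
```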
