@@ -15,34 +15,36 @@ Feature: llama.cpp server
     And 64 server max tokens to predict
     And prometheus compatible metrics exposed
     And jinja templates are enabled
-    And chat template file ../../../tests/chat/templates/meta-llama-Meta-Llama-3.1-8B-Instruct.jinja
-    Then the server is starting
-    Then the server is healthy
-
-  Scenario: Health
-    Then the server is ready
-    And all slots are idle
 
+  @wip
   Scenario Outline: OAI Compatibility w/ required tool
-    Given a model test
+    Given a chat template file ../../../tests/chat/templates/<template_name>.jinja
+    And the server is starting
+    And the server is healthy
+    And a model test
     And <n> max tokens to predict
     And a user prompt write a hello world in python
     And a tool choice <tool_choice>
     And tools <tools>
-    Given an OAI compatible chat completions request with no api error
+    And an OAI compatible chat completions request with no api error
     Then tool <tool_name> is called with arguments <tool_arguments>
 
     Examples: Prompts
-      | n  | tool_name | tool_arguments     | tool_choice | tools |
-      | 64 | test      | {}                 | required    | [{"type":"function", "function": {"name": "test", "description": "", "parameters": {"type": "object", "properties": {}}}}] |
-      | 16 | ipython   | {"code": "it and"} | required    | [{"type":"function", "function": {"name": "ipython", "description": "", "parameters": {"type": "object", "properties": {"code": {"type": "string", "description": ""}}, "required": ["code"]}}}] |
+      | template_name                         | n  | tool_name | tool_arguments     | tool_choice | tools |
+      | meta-llama-Meta-Llama-3.1-8B-Instruct | 64 | test      | {}                 | required    | [{"type":"function", "function": {"name": "test", "description": "", "parameters": {"type": "object", "properties": {}}}}] |
+      | meta-llama-Meta-Llama-3.1-8B-Instruct | 16 | ipython   | {"code": "it and"} | required    | [{"type":"function", "function": {"name": "ipython", "description": "", "parameters": {"type": "object", "properties": {"code": {"type": "string", "description": ""}}, "required": ["code"]}}}] |
+      | meetkai-functionary-medium-v3.2       | 64 | test      | {}                 | required    | [{"type":"function", "function": {"name": "test", "description": "", "parameters": {"type": "object", "properties": {}}}}] |
+      | meetkai-functionary-medium-v3.2       | 64 | ipython   | {"code": "Yes,"}   | required    | [{"type":"function", "function": {"name": "ipython", "description": "", "parameters": {"type": "object", "properties": {"code": {"type": "string", "description": ""}}, "required": ["code"]}}}] |
 
   Scenario: OAI Compatibility w/ no tool
-    Given a model test
+    Given a chat template file ../../../tests/chat/templates/meta-llama-Meta-Llama-3.1-8B-Instruct.jinja
+    And the server is starting
+    And the server is healthy
+    And a model test
     And 16 max tokens to predict
     And a user prompt write a hello world in python
     And a tool choice <tool_choice>
     And tools []
-    Given an OAI compatible chat completions request with no api error
+    And an OAI compatible chat completions request with no api error
    Then no tool is called
 
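For context, these scenarios exercise the server's OpenAI-compatible /v1/chat/completions endpoint with a tools array and tool_choice set to "required". Below is a minimal sketch, outside the behave suite, of the kind of request the first scenario issues; the base URL, port, and model name are illustrative assumptions, not values taken from the test harness.

```python
# Minimal sketch of the "required tool" request; not part of the test suite.
# Assumes a llama.cpp server with jinja templates enabled, listening on
# localhost:8080 (port and model name are hypothetical here).
import requests

payload = {
    "model": "test",
    "max_tokens": 64,
    "messages": [{"role": "user", "content": "write a hello world in python"}],
    "tool_choice": "required",  # force the reply to be a tool call, not plain text
    "tools": [{
        "type": "function",
        "function": {
            "name": "test",
            "description": "",
            "parameters": {"type": "object", "properties": {}},
        },
    }],
}

resp = requests.post("http://localhost:8080/v1/chat/completions", json=payload)
resp.raise_for_status()
message = resp.json()["choices"][0]["message"]
# The scenario then asserts a call to "test" with arguments {}, i.e. roughly
# message["tool_calls"][0]["function"]["name"] == "test"; the "no tool" scenario
# sends "tools": [] and expects no tool_calls in the reply.
print(message.get("tool_calls"))
```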