Background & Description
Some newer LLMs support tool calling, for example Qwen3-4B-Instruct-2507 used together with Qwen Agent. I would like LlamaSharp to harness this capability and become more agentic, as shown in the following Python example:
from qwen_agent.agents import Assistant

# Point Qwen Agent at a local OpenAI-compatible server hosting the model.
llm_cfg = {
    'model': 'Qwen3-4B-Instruct-2507',
    'model_server': 'http://localhost:8000/v1',  # api_base
    'api_key': 'EMPTY',
}

# Tools: two MCP servers (time and fetch) plus the built-in code interpreter.
tools = [
    {
        'mcpServers': {
            'time': {
                'command': 'uvx',
                'args': ['mcp-server-time', '--local-timezone=Asia/Shanghai']
            },
            'fetch': {
                'command': 'uvx',
                'args': ['mcp-server-fetch']
            }
        }
    },
    'code_interpreter',
]

bot = Assistant(llm=llm_cfg, function_list=tools)

messages = [{
    'role': 'user',
    'content': 'https://qwenlm.github.io/blog/ Introduce the latest developments of Qwen'
}]

# bot.run streams incremental response lists; drain the stream and print the final one.
for responses in bot.run(messages=messages):
    pass
print(responses)
This would let us get more out of LlamaSharp inference and support the future of agentic AI.
API & Usage
Maybe you could try integrating Qwen Agent or a similar solution into LlamaSharp.
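To make the request concrete, below is a minimal sketch of the bare tool-calling loop that any agentic layer (whether Qwen Agent or something built into LlamaSharp) has to drive, talking to the same local OpenAI-compatible endpoint as the example above. The model name and server URL are taken from that example; the get_current_time tool, its schema, and its implementation are purely illustrative assumptions, not an existing LlamaSharp API.

import json
from datetime import datetime
from zoneinfo import ZoneInfo
from openai import OpenAI

client = OpenAI(base_url='http://localhost:8000/v1', api_key='EMPTY')

# Hypothetical local tool: return the current time in a given IANA timezone.
def get_current_time(timezone: str) -> str:
    return datetime.now(ZoneInfo(timezone)).isoformat()

# Schema advertised to the model so it knows the tool exists and how to call it.
tools = [{
    'type': 'function',
    'function': {
        'name': 'get_current_time',
        'description': 'Return the current time in a given IANA timezone.',
        'parameters': {
            'type': 'object',
            'properties': {'timezone': {'type': 'string'}},
            'required': ['timezone'],
        },
    },
}]

messages = [{'role': 'user', 'content': 'What time is it in Asia/Shanghai?'}]

# First pass: the model decides whether to answer directly or call a tool.
response = client.chat.completions.create(
    model='Qwen3-4B-Instruct-2507', messages=messages, tools=tools)
msg = response.choices[0].message

if msg.tool_calls:
    messages.append(msg)
    for call in msg.tool_calls:
        args = json.loads(call.function.arguments)
        result = get_current_time(**args)
        # Feed each tool result back so the model can compose a final answer.
        messages.append({'role': 'tool', 'tool_call_id': call.id, 'content': result})
    response = client.chat.completions.create(
        model='Qwen3-4B-Instruct-2507', messages=messages, tools=tools)

print(response.choices[0].message.content)

The same loop (advertise tool schemas, detect tool calls in the model output, execute them, feed results back) is what a LlamaSharp-side integration would need to provide, regardless of whether it wraps Qwen Agent or implements the flow natively.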
How to implement
No response