
Tool Calls not working using vLLM #477

@MarkoBL

Description


Bug Report

Hi, I'm using this library with vLLM, which exposes an OpenAI-compatible server, to run Mistral Small. Until now, everything has worked great. Today I tried tool calling on the ChatEndpoint, but instead of the tools being called, I got the following text output:

Input:
What's the weather in Chicago and London?
Output:

[TOOL_CALLS]GetCurrentWeatherAsync_fb41a1e1cb8414e6de6f649aadbbf3f0{"location": "Chicago", "unit": "Celsius"}
[TOOL_CALLS]GetCurrentWeatherAsync_fb41a1e1cb8414e6de6f649aadbbf3f0{"location": "London", "unit": "Celsius"}

I know this is probably out of scope for this library, but maybe you can help me identify the problem.
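For what it's worth, raw `[TOOL_CALLS]` tokens appearing in the content stream usually mean the server never parsed them into structured tool calls, so the client library has nothing to invoke. A minimal launch sketch with vLLM's tool-call parsing enabled (the model name is a placeholder, and exact flag availability depends on your vLLM version):

```shell
# Sketch, not a verified command line for this exact setup.
# Without --enable-auto-tool-choice and a --tool-call-parser, vLLM's
# OpenAI-compatible server streams the model's raw [TOOL_CALLS] tokens
# back as ordinary text content instead of structured tool_calls deltas.
vllm serve mistralai/Mistral-Small-Instruct-2409 \
    --enable-auto-tool-choice \
    --tool-call-parser mistral
```

If the server is started this way, the tool calls should arrive in the stream as `tool_calls` deltas rather than as text, which is what the client library expects.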

To Reproduce

This is the code I'm using; note that I'm calling StreamCompletionAsync.

var tools = new List<Tool>
{
    Tool.GetOrCreateTool(typeof(WeatherService), nameof(WeatherService.GetCurrentWeatherAsync))
};

var chatRequest = new ChatRequest(_messages, tools, "auto", Model.Id);

var sequence = 0L;
var response = await Api.ChatEndpoint.StreamCompletionAsync(chatRequest, async chatStreamingResponse =>
{
    if (Streaming != null)
    {
        // Forward each text delta to the streaming handler.
        var delta = chatStreamingResponse.FirstChoice?.Delta?.Content;
        if (!string.IsNullOrEmpty(delta))
        {
            await Streaming.Invoke(sequence, delta, streamingData, this);
            ++sequence;
        }
    }
}, false, _tokenSource.Token);
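One more observation: the callback above only forwards Delta.Content, so even correctly parsed tool calls would be dropped on the client side. A rough sketch of the shape I'd expect the handler to need (Delta.ToolCalls and its members are my assumption based on the OpenAI streaming schema, not verified against this library's API):

```csharp
var response = await Api.ChatEndpoint.StreamCompletionAsync(chatRequest, async chatStreamingResponse =>
{
    var delta = chatStreamingResponse.FirstChoice?.Delta;

    // Forward plain text deltas as before.
    if (!string.IsNullOrEmpty(delta?.Content))
    {
        await Streaming.Invoke(sequence, delta.Content, streamingData, this);
        ++sequence;
    }

    // Tool calls also arrive incrementally as deltas; they must be
    // accumulated rather than ignored. (Delta.ToolCalls is assumed here,
    // mirroring the OpenAI streaming format's delta.tool_calls field.)
    if (delta?.ToolCalls != null)
    {
        foreach (var toolCall in delta.ToolCalls)
        {
            // Accumulate the function name and argument fragments per
            // tool-call index, then invoke the tool once the stream
            // reports finish_reason "tool_calls".
        }
    }
}, false, _tokenSource.Token);
```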

Expected behavior

The tools should be called.

Metadata

Assignees: No one assigned

Labels: question (Further information is requested)
