
Prompty, openai-messages-token-helper and tool calls #2340

@gaborvar

Description


Hi,
Prompty apparently has some support for tool calls. It appears that prompty.prepare(prompt, data) compacts the calling ('assistant') message and the subsequent return ('tool') message into a single assistant message:

{'role': 'assistant', 'content': 'None\n\ntool:\n{"functionname": true, "result": ["...", "..."]}'}
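For reference, the standard OpenAI chat-completions representation keeps these as two separate messages rather than one folded assistant message. A minimal sketch (the call id and arguments are illustrative, not taken from the issue):

```python
# Sketch of the standard two-message tool-call shape that gets collapsed:
# an assistant message carrying tool_calls, followed by a 'tool' message
# returning the result. The id and arguments are hypothetical.
assistant_call = {
    "role": "assistant",
    "content": None,
    "tool_calls": [
        {
            "id": "call_abc123",  # hypothetical call id
            "type": "function",
            "function": {
                "name": "functionname",
                "arguments": '{"query": "..."}',
            },
        }
    ],
}
tool_return = {
    "role": "tool",
    "tool_call_id": "call_abc123",  # must match the assistant call id
    "content": '{"functionname": true, "result": ["...", "..."]}',
}
messages = [assistant_call, tool_return]
```

Prompty's compaction merges both of these into the single assistant message shown above, losing the `tool` role and the `tool_call_id` linkage.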

The additional assistant message breaks the normal user/assistant cadence that promptmanager relies on:

for user_message, assistant_message in zip(remaining_messages[0::2], remaining_messages[1::2]):
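A minimal illustration of why this breaks: the even/odd `zip` assumes a strict user/assistant alternation, so a single extra assistant message shifts every subsequent pair out of alignment. A sketch with illustrative message contents:

```python
# The pairing assumes remaining_messages strictly alternates user/assistant.
remaining_messages = [
    {"role": "user", "content": "q1"},
    {"role": "assistant", "content": "a1"},
    {"role": "assistant", "content": "tool result folded in"},  # extra message
    {"role": "user", "content": "q2"},
    {"role": "assistant", "content": "a2"},
]
pairs = list(zip(remaining_messages[0::2], remaining_messages[1::2]))
roles = [(u["role"], a["role"]) for u, a in pairs]
# After the extra assistant message, the second "pair" is
# (assistant, user) instead of (user, assistant), and "a2" is dropped
# entirely because zip stops at the shorter slice.
```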

Before considering a fix, it would be useful to understand the planned evolution of the message-management part of the project. In addition to its token-accounting feature, openai-messages-token-helper overlaps with the recently introduced prompty: both repackage messages (e.g. separating out few-shots and the latest user message and adding them back). I understand that tool calls are not a priority in this project; however, the growing code base means more maintenance work to ensure that tool calls pass through transparently.

The best course of action also depends on prompty's stance towards tool_calls. Will the current support evolve?

If tool_calls coexistence remains out of scope for this project, I would welcome pointers to a comparable scheme that is friendlier to this project. My current code builds tool calls dynamically to interface intelligently with the procedural world. Would structured outputs with response_format fit this project's philosophy better than tool_calls?
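For comparison, a structured-outputs request would move the schema from the tool definition to the request's `response_format`. A sketch based on the OpenAI chat-completions API (the schema name and fields are illustrative):

```python
# Sketch: requesting structured output via response_format instead of tools.
# The schema name and fields are illustrative, not from the original issue.
response_format = {
    "type": "json_schema",
    "json_schema": {
        "name": "tool_result",
        "strict": True,
        "schema": {
            "type": "object",
            "properties": {
                "result": {"type": "array", "items": {"type": "string"}},
            },
            "required": ["result"],
            "additionalProperties": False,
        },
    },
}
# Unlike tool_calls, the model's reply is a plain assistant message whose
# content is JSON matching the schema, so the strict user/assistant cadence
# that promptmanager expects would be preserved.
```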

Thanks for the great work on the project.
