Description
Hi @pamelafox
I use message_builder in code that relies on the tool_calls feature of GPT.
model_helper.py assumes that certain keys, if present, are never set to None in the message returned by the Chat Completions API; when it encounters a None value it raises:
raise ValueError(f"Could not encode unsupported message value type: {type(value)}")
Example return value:
ChatCompletionMessage(content='Inform, assist, connect with legal experts, facilitate paperwork.', refusal=None, role='assistant', audio=None, function_call=None, tool_calls=None)
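For reference, here is a minimal sketch of how this surfaces in my code. The count_tokens_for_message call is my assumption about the entry point that reaches model_helper.py; adapt it if you hit the same code via build_messages instead:

```python
# Minimal reproduction sketch (assumes count_tokens_for_message(model, message)
# from openai_messages_token_helper is the code path that reaches model_helper.py).
from openai_messages_token_helper import count_tokens_for_message

# Dict form of the ChatCompletionMessage above, e.g. as produced by message.model_dump():
assistant_message = {
    "content": "Inform, assist, connect with legal experts, facilitate paperwork.",
    "refusal": None,
    "role": "assistant",
    "audio": None,
    "function_call": None,
    "tool_calls": None,
}

# Raises: ValueError: Could not encode unsupported message value type: <class 'NoneType'>
count_tokens_for_message("gpt-4o", assistant_message)
```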
Keys set to None are routinely returned by the model, in my case gpt-4o 2024-08-06.
It seems to be a bug that stays hidden in other parts of the code as long as tool_calls is not used, i.e. the None keys are dropped before they reach this line. However, for tool_calls to work correctly I need to pass back to the model what it returned previously, and the OpenAI API returns an error if I omit certain parts of its earlier responses. A sketch of the workaround I am using for now follows below.
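This is only a sketch, under the assumption that it is safe to drop None-valued keys for token counting while the unmodified message objects are still what gets sent back to the API (so tool_calls round-tripping is unaffected):

```python
# Workaround sketch: strip None-valued keys before token counting only.
from openai_messages_token_helper import count_tokens_for_message

def strip_none_values(message: dict) -> dict:
    """Return a copy of the message without None-valued keys, so model_helper.py
    only sees value types it can encode (str, list, dict)."""
    return {key: value for key, value in message.items() if value is not None}

assistant_message = {
    "content": "Inform, assist, connect with legal experts, facilitate paperwork.",
    "refusal": None,
    "role": "assistant",
    "audio": None,
    "function_call": None,
    "tool_calls": None,
}

# Token counting now succeeds; the original assistant_message (including tool_calls
# when present) is still the one passed back to the OpenAI API.
tokens = count_tokens_for_message("gpt-4o", strip_none_values(assistant_message))
print(tokens)
```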
Is using tool_calls together with openai_messages_token_helper a niche scenario, or is this worth fixing in the repo?