Thanks for sharing, we can surmise a lot from this.

2025-02-01T03:02:18.218Z debug: Making request to https://api.portkey.ai/v1/chat/completions

This shows that each request, including the title-generation request, is correctly going to your Portkey endpoint.

The "OpenAI" verbiage is just for the relevant backend part being used, since custom endpoints share functionality with OpenAI due to being "OpenAI-like" or OpenAI API-compatible.

The high token counts come from the default Artifacts prompt, which is indeed lengthy; you likely have that feature enabled.

Lastly, title generation may be failing because Google may not accept a chat history consisting of a single System message, as seen here:

model…
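To illustrate the shape of the problem (this is a hypothetical example, not the exact request LibreChat sends): a title request whose message history contains only a system message and no user turn is the kind of payload some providers, including Google-backed models, can reject.

```ts
// Hypothetical payload shape that can trip up Google-backed models:
// a chat history consisting of a lone system message with no user turn.
// Not the exact request LibreChat sends; shown only to illustrate why
// the upstream provider may reject it.
const titleRequest = {
  model: "gemini-1.5-pro", // placeholder model name
  messages: [
    {
      role: "system" as const,
      content: "Write a concise title for the following conversation...",
    },
    // no { role: "user", ... } turn follows, which some providers reject
  ],
};
```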
