How to filter tokens in callback in the on_llm_new_token() OR in the "on_chat_model_stream" event in astream_events? #1795
Unanswered · NewGHUser4321 asked this question in Q&A
Replies: 1 comment
@vbarda any suggestions on this, please?
I have a simple LangGraph implemented as shown below.

Sometimes OpenAI gpt-4o generates content for tool calls, and I don't want those tokens in my stream. How can I filter them out in the callback handler's on_llm_new_token(), or in the "on_chat_model_stream" event from astream_events?
Below is how the on_llm_new_token() callback is implemented.
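The original handler code isn't shown above, so here is a minimal sketch of how the filtering could work in `on_llm_new_token`, assuming a LangChain-style callback handler. The class name, the `tokens` list, and the duck-typed `getattr` checks are illustrative; the real mechanism is the `chunk` keyword argument LangChain passes alongside each token, whose message carries a non-empty `tool_call_chunks` list when the delta belongs to a tool call.

```python
# Hypothetical sketch: drop tool-call deltas inside on_llm_new_token.
# In real code this class would subclass langchain_core.callbacks.AsyncCallbackHandler;
# it is written dependency-free here so the filtering logic stands on its own.

def is_tool_call_delta(chunk) -> bool:
    """True when a streamed chunk is part of a tool call rather than visible text."""
    message = getattr(chunk, "message", None)
    # OpenAI-style tool calls stream with a non-empty tool_call_chunks list.
    return bool(getattr(message, "tool_call_chunks", None))

class FilteringTokenHandler:
    """Collects only user-facing tokens, skipping tool-call deltas."""

    def __init__(self):
        self.tokens: list[str] = []

    async def on_llm_new_token(self, token: str, *, chunk=None, **kwargs) -> None:
        # LangChain passes the raw ChatGenerationChunk as `chunk`; when it is
        # part of a tool call, silently drop the token.
        if chunk is not None and is_tool_call_delta(chunk):
            return
        self.tokens.append(token)
```

With this in place, tokens emitted while the model writes tool-call arguments never reach `self.tokens` (or whatever sink the real handler forwards tokens to).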
Below is how the "on_chat_model_stream" event in astream_events is handled.
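Likewise, the astream_events consumer isn't included above, so the following is an assumed shape. For an "on_chat_model_stream" event, `event["data"]["chunk"]` is the streamed message chunk; skipping chunks whose `tool_call_chunks` is non-empty (and chunks with empty `content`) drops the tool-call tokens. `graph` and `inputs` stand in for the asker's compiled graph and input dict.

```python
# Hypothetical astream_events consumer: keep plain text, skip tool-call deltas.

def extract_visible_text(event):
    """Return user-facing text from an on_chat_model_stream event, or None to skip."""
    if event.get("event") != "on_chat_model_stream":
        return None
    chunk = event["data"]["chunk"]  # an AIMessageChunk in langchain_core
    if getattr(chunk, "tool_call_chunks", None):
        return None  # delta belongs to a tool call, not the answer
    content = getattr(chunk, "content", "")
    return content or None

async def stream_answer(graph, inputs):
    """Yield only the answer tokens from a compiled graph (names assumed)."""
    async for event in graph.astream_events(inputs, version="v2"):
        text = extract_visible_text(event)
        if text is not None:
            yield text
```

If the tool-calling model runs in a dedicated graph node, another common option is to filter on `event["metadata"].get("langgraph_node")` or on tags instead of inspecting each chunk.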
I'm specifically looking for options to filter out the tokens generated when tools are called, in either of those two places.
Any help will be appreciated. Thank you.