
Conversation

@skoob13 skoob13 (Contributor) commented Jun 12, 2025

Problem

The LangChain callback doesn't count reasoning and cache write/read tokens.

Changes

  • Add cache write and read tokens.
  • Add reasoning tokens (see the sketch after this list for where both counts appear in LangChain usage metadata).
  • Add unit tests.
  • Update langchain versions in dev.
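
For context on the first two bullets, here is a minimal sketch of where these counts typically appear in recent langchain-core responses. The key names (input_token_details, output_token_details, cache_read, cache_creation, reasoning) follow LangChain's usage-metadata conventions but can vary by provider and version; this is illustrative rather than the PR's actual parsing code.

    # Illustrative shape of AIMessage.usage_metadata in recent langchain-core
    # versions; some providers omit the *_token_details keys entirely.
    usage_metadata = {
        "input_tokens": 1350,
        "output_tokens": 420,
        "total_tokens": 1770,
        "input_token_details": {"cache_read": 1024, "cache_creation": 256},
        "output_token_details": {"reasoning": 300},
    }

    # Read the new counts defensively, since older providers/versions
    # do not include the detail dictionaries.
    input_details = usage_metadata.get("input_token_details") or {}
    output_details = usage_metadata.get("output_token_details") or {}

    cache_read_tokens = input_details.get("cache_read", 0)
    cache_write_tokens = input_details.get("cache_creation", 0)
    reasoning_tokens = output_details.get("reasoning", 0)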

@skoob13 skoob13 requested review from a team and k11kirky June 12, 2025 10:15

@greptile-apps greptile-apps bot left a comment


PR Summary

Enhanced LangChain callback token tracking in the PostHog Python SDK by adding monitoring for reasoning tokens and cache read/write tokens.

  • Added cache token tracking in posthog/ai/langchain/callbacks.py to monitor both read and write operations across different LLM providers
  • Implemented reasoning token tracking, with support for models like o1-mini, and structured handling through a new ModelUsage dataclass (sketched after this list)
  • Added detailed test cases in test_callbacks.py covering OpenAI and Anthropic token tracking scenarios
  • Updated minimum version requirements for LangChain dependencies in pyproject.toml to support new token tracking features
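
As a rough illustration of the structured handling mentioned above, a ModelUsage-style container could look like the sketch below. This is a hypothetical example; the actual dataclass in posthog/ai/langchain/callbacks.py may use different field names or defaults.

    from dataclasses import dataclass
    from typing import Optional

    # Hypothetical sketch of a ModelUsage-style container; the real dataclass
    # in this PR may differ.
    @dataclass
    class ModelUsage:
        input_tokens: int = 0
        output_tokens: int = 0
        reasoning_tokens: Optional[int] = None
        cache_read_tokens: Optional[int] = None
        cache_write_tokens: Optional[int] = None

    # Example: counts collected from a single LLM response.
    usage = ModelUsage(
        input_tokens=1350,
        output_tokens=420,
        reasoning_tokens=300,
        cache_read_tokens=1024,
        cache_write_tokens=256,
    )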

3 files reviewed, no comments

Truncated diff excerpt that the comment below refers to (the start of the expression is cut off in the diff view):

        else captured_count
    )  # For Bedrock, the token count is a list when streamed

    parsed_usage[type_key] = final_count


Reviewer: Nitpick (not related to the change): could captured_count (assigned via captured_count = usage[model_key]) be None in any case? Should we add a null check here to prevent potential exceptions?


@skoob13 (Contributor, Author): It can be None. The code won't crash even when it's None, or did I miss the spot?


Reviewer: Nope, never mind, I was overthinking it :)
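
For readers following the thread, a defensive version of that normalization could look like the sketch below. This is illustrative only and not the code in this PR; in particular, summing the per-chunk counts is just one plausible way to handle the Bedrock streaming case.

    # Illustrative sketch (not this PR's code): normalize a captured token
    # count that may be an int, a list (Bedrock returns per-chunk counts
    # when streaming), or None.
    def normalize_token_count(captured_count):
        if captured_count is None:
            return None
        if isinstance(captured_count, list):
            # Summing per-chunk counts is one plausible choice here.
            return sum(captured_count)
        return captured_count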

@skoob13 skoob13 requested a review from Twixes June 12, 2025 10:47
@skoob13 skoob13 force-pushed the feat/reasoning-cached-tokens branch from 3baccfe to 2a6f4f8 on June 13, 2025 09:06
@skoob13 skoob13 merged commit 52df246 into master Jun 13, 2025
7 checks passed
@skoob13 skoob13 deleted the feat/reasoning-cached-tokens branch June 13, 2025 13:02