fix: preserve dropped fields in AssistantMessage and ResultMessage#563
Open
wenfengwang wants to merge 1 commit into anthropics:main
Conversation
Force-pushed from 31ed4cb to 9c668b0
The message parser was dropping significant fields from the raw CLI JSON output when constructing AssistantMessage and ResultMessage dataclasses.

AssistantMessage now includes:
- id: Anthropic API message ID
- usage: per-message token usage breakdown
- stop_reason: why generation stopped
- stop_sequence: matched stop sequence (if any)
- session_id: session identifier
- uuid: unique message identifier

ResultMessage now includes:
- model_usage: authoritative per-model usage breakdown (from the CLI's modelUsage field), which is more accurate than the aggregate usage field for cost tracking and model identification
- stop_reason: why generation stopped
- permission_denials: list of denied permissions
- uuid: unique message identifier

All new fields are optional with None defaults, ensuring full backward compatibility.

Fixes anthropics#562

Co-authored-by: Cursor <cursoragent@cursor.com>
Force-pushed from b4c24ee to 2a2a42e
Summary
- `AssistantMessage`: `id`, `usage`, `stop_reason`, `stop_sequence`, `session_id`, `uuid`
- `ResultMessage`: `model_usage` (from the CLI's `modelUsage`), `stop_reason`, `permission_denials`, `uuid`
- Updated `message_parser.py` to extract these fields from the CLI JSON output
- All new fields default to `None`, fully backward compatible

Motivation

The CLI outputs rich JSON with many useful fields, but `message_parser.py` was only extracting a subset. Key losses:

- `modelUsage`: the authoritative per-model usage breakdown, more accurate than the aggregate `usage` for cost tracking (see #112 in the TypeScript SDK, "Python SDK ResultMessage's message.usage['input_tokens'] isn't correct"). It is also the only way to identify which model was actually used.
- `AssistantMessage.usage`: per-message token consumption, essential for per-turn cost tracking.
- `uuid`: needed for correlating messages with session transcript files.

A full field comparison is documented in #562.
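A minimal sketch of the kind of extraction described above, assuming a hypothetical helper and raw-JSON key names (only `modelUsage` is named by this PR; the real `message_parser.py` will differ):

```python
from typing import Any


def parse_result_fields(data: dict[str, Any]) -> dict[str, Any]:
    """Pull the previously dropped fields out of raw CLI JSON.

    dict.get() returns None for missing keys, so output from older CLI
    versions still parses and the new fields fall back to None.
    """
    return {
        # camelCase in the raw CLI JSON, snake_case on the dataclass
        "model_usage": data.get("modelUsage"),
        # Remaining key names are assumptions for illustration:
        "stop_reason": data.get("stop_reason"),
        "permission_denials": data.get("permission_denials"),
        "uuid": data.get("uuid"),
    }


# Hypothetical payload: new CLI output carries the field, old output omits it.
fields = parse_result_fields({"modelUsage": {"some-model": {"inputTokens": 1200}}})
```

Reading with `.get()` rather than indexing is what makes the change non-breaking at parse time: absent keys simply become `None` instead of raising `KeyError`.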
Test plan
- `AssistantMessage` with all fields populated (from real CLI output)
- `AssistantMessage` optional fields defaulting to `None`
- `ResultMessage` with `modelUsage`, `stop_reason`, `permission_denials`, `uuid`
- `ResultMessage` optional fields defaulting to `None`
- `ResultMessage` with multiple models in `modelUsage`

Fixes #562
Made with Cursor