System prompt not part of model spans #721
Unanswered · robin-fiddler asked this question in Q&A
According to the OpenTelemetry gen_ai semantic conventions, system prompts/instructions should be represented as events. In addition, the key Strands uses is a custom attribute, "system_prompt", whereas the convention names the system message event "gen_ai.system.message".
In Strands, I can currently see the System Prompt only at the Agent span level. It is not being passed into the Model (chat) span.
For comparison, other frameworks such as LangSmith record the system prompt in the model span as a system message. Since the system prompt is ultimately sent to the LLM, it should also appear in the Model span to maintain consistency and observability.
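To make the naming point concrete, here is a minimal sketch of the difference between the flat `system_prompt` attribute and the `gen_ai.system.message` event shape the convention describes. The `Span` class below is a stand-in for illustration, not the OpenTelemetry SDK, and the span name and `gen_ai.system` value are assumed examples.

```python
class Span:
    """Minimal stand-in for an OTel span that records attributes and events."""
    def __init__(self, name):
        self.name = name
        self.attributes = {}
        self.events = []

    def set_attribute(self, key, value):
        self.attributes[key] = value

    def add_event(self, name, attributes=None):
        self.events.append({"name": name, "attributes": attributes or {}})


chat_span = Span("chat gpt-4")  # hypothetical model-span name

# What Strands reportedly does today, on the Agent span only:
#   agent_span.set_attribute("system_prompt", "You are a helpful assistant.")

# What the gen_ai convention describes: a gen_ai.system.message event,
# recorded on the span that represents the LLM call.
chat_span.add_event(
    "gen_ai.system.message",
    attributes={
        "gen_ai.system": "openai",  # assumed provider value
        "content": "You are a helpful assistant.",
    },
)
```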
Expected Behavior
• The system prompt should be available in the Model (chat) span (in addition to being visible in the Agent span).
• This ensures parity with other frameworks and makes the tracing more consistent with the gen_ai convention.
Suggested Action
• Update the instrumentation so that system prompts are propagated to the Model span.
• Confirm alignment with the gen_ai semantic convention for proper OTEL trace representation.
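The propagation part of the suggested fix could be sketched as follows. This is an illustrative outline, not Strands' actual instrumentation: `start_span`, `invoke_model`, and the `Span` class are hypothetical stand-ins, and only the event name `gen_ai.system.message` is taken from the semantic conventions.

```python
from contextlib import contextmanager


class Span:
    """Minimal stand-in for an OTel span that records events."""
    def __init__(self, name, parent=None):
        self.name, self.parent = name, parent
        self.events = []

    def add_event(self, name, attributes=None):
        self.events.append((name, attributes or {}))


@contextmanager
def start_span(name, parent=None):
    # Stand-in for tracer.start_as_current_span(...)
    yield Span(name, parent)


def invoke_model(messages, system_prompt, parent_span):
    # Record the system prompt on the model (chat) span itself, not only
    # on the agent span, so trace viewers show it next to the messages
    # that were actually sent to the LLM.
    with start_span("chat", parent=parent_span) as chat_span:
        if system_prompt:
            chat_span.add_event(
                "gen_ai.system.message",
                attributes={"content": system_prompt},
            )
        for message in messages:
            chat_span.add_event(
                "gen_ai.user.message",
                attributes={"content": message},
            )
        return chat_span


with start_span("agent") as agent_span:
    model_span = invoke_model(
        ["Hello"], "You are a helpful assistant.", agent_span
    )
```

With this shape, the system prompt appears on both spans: the agent span keeps whatever it records today, and the chat span gains the convention-compliant event.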
(Screenshot: Agent Span)
(Screenshot: Model (chat) Span)