10 changes: 5 additions & 5 deletions docs/gen-ai/gen-ai-spans.md
@@ -434,7 +434,7 @@ application needs and maturity:
1. [Default] Don't record instructions, inputs, or outputs.

2. Record instructions, inputs, and outputs on the GenAI spans using corresponding
-   attributes (`gen_ai.system.instructions`, `gen_ai.input.messages`,
+   attributes (`gen_ai.system_instructions`, `gen_ai.input.messages`,
`gen_ai.output.messages`).

This approach is best suited for situations where telemetry volume is manageable
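For the attribute-based approach (option 2 above), the following is a minimal sketch using the OpenTelemetry Python API; the message shapes, the JSON serialization, and the stubbed model call are illustrative assumptions rather than normative requirements.

```python
import json

from opentelemetry import trace

tracer = trace.get_tracer("genai-instrumentation-example")


def call_model(model, input_messages):
    # Stand-in for a real provider call.
    return [{"role": "assistant", "parts": [{"type": "text", "content": "Hi!"}],
             "finish_reason": "stop"}]


def instrumented_chat(model, system_instructions, input_messages):
    with tracer.start_as_current_span(f"chat {model}") as span:
        span.set_attribute("gen_ai.operation.name", "chat")
        span.set_attribute("gen_ai.request.model", model)
        # Opt-in content attributes, recorded here as JSON strings.
        span.set_attribute("gen_ai.system_instructions", json.dumps(system_instructions))
        span.set_attribute("gen_ai.input.messages", json.dumps(input_messages))

        output_messages = call_model(model, input_messages)

        span.set_attribute("gen_ai.output.messages", json.dumps(output_messages))
        return output_messages
```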
@@ -455,7 +455,7 @@ application needs and maturity:

#### Recording content on attributes

-The content captured in `gen_ai.system.instructions`, `gen_ai.input.messages`,
+The content captured in `gen_ai.system_instructions`, `gen_ai.input.messages`,
and `gen_ai.output.messages` attributes is likely to be large.

It may contain media, and even in the text form, it may be larger than
@@ -484,7 +484,7 @@ such as individual message contents, preserving JSON structure.
Instrumentations MAY support user-defined in-process hooks to handle content upload.

The hook SHOULD operate independently of the opt-in flags that control capturing of
-`gen_ai.system.instructions`, `gen_ai.input.messages`, and `gen_ai.output.messages`.
+`gen_ai.system_instructions`, `gen_ai.input.messages`, and `gen_ai.output.messages`.

If such a hook is supported and configured, instrumentations SHOULD invoke it regardless
of the span sampling decision with:
@@ -497,7 +497,7 @@ of the span sampling decision with:
The hook implementation SHOULD be able to enrich and modify provided span, instructions,
and message objects.

-If instrumentation is configured to also record `gen_ai.system.instructions`,
+If instrumentation is configured to also record `gen_ai.system_instructions`,
`gen_ai.input.messages`, and `gen_ai.output.messages` attributes, it SHOULD do it
after calling the hook and SHOULD record values that were potentially modified within
the hook implementation.
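A minimal sketch of such a hook is shown below, assuming a Python instrumentation that accepts a user-supplied callback; the hook signature, the message shape, and the in-memory blob store are illustrative assumptions, not part of this specification.

```python
import hashlib
from typing import Any, Optional

from opentelemetry.trace import Span

BLOB_STORE: dict = {}  # stand-in for external content storage


def _upload(content: str) -> str:
    key = hashlib.sha256(content.encode("utf-8")).hexdigest()
    BLOB_STORE[key] = content
    return f"otel-content://{key}"


def content_upload_hook(span: Optional[Span],
                        system_instructions: Any,
                        input_messages: Any,
                        output_messages: Any) -> None:
    # The hook may enrich or modify the provided objects in place; the
    # instrumentation records the content attributes only after it returns.
    for message in (input_messages or []) + (output_messages or []):
        for part in message.get("parts", []):
            text = part.get("content")
            if isinstance(text, str) and len(text) > 4096:
                part["content"] = _upload(text)
```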
@@ -511,7 +511,7 @@ implementation including

Application or OpenTelemetry distributions MAY also implement content uploading
in the telemetry processing pipeline (in-process or via a collector), based on the
-`gen_ai.system.instructions`, `gen_ai.input.messages`, and `gen_ai.output.messages`
+`gen_ai.system_instructions`, `gen_ai.input.messages`, and `gen_ai.output.messages`
attributes. Given the potential data volume, it is RECOMMENDED to tune batching
and export settings accordingly in the OpenTelemetry SDK pipeline.
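For example, with the OpenTelemetry Python SDK, batching and export limits can be tuned when constructing the `BatchSpanProcessor`; the values below are purely illustrative.

```python
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

provider = TracerProvider()
provider.add_span_processor(
    BatchSpanProcessor(
        OTLPSpanExporter(),
        max_queue_size=512,        # fewer queued spans when each one is large
        max_export_batch_size=64,  # smaller batches keep export payloads bounded
        schedule_delay_millis=2000,
    )
)
```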

18 changes: 9 additions & 9 deletions docs/gen-ai/non-normative/examples-llm-calls.md
@@ -296,8 +296,8 @@ They are likely to be siblings if there is an encompassing span.
| `gen_ai.usage.output_tokens` | `17` |
| `gen_ai.usage.input_tokens` | `47` |
| `gen_ai.response.finish_reasons`| `["tool_calls"]` |
-| `gen_ai.input.messages` | <span id="gen-ai-input-messages-tool-call-span-1">`gen_ai.input.messages` value</span> |
-| `gen_ai.output.messages` | <span id="gen-ai-output-messages-tool-call-span-1">`gen_ai.output.messages` value</span> |
+| `gen_ai.input.messages` | [`gen_ai.input.messages`](#gen-ai-input-messages-tool-call-span-1) |
+| `gen_ai.output.messages` | [`gen_ai.output.messages`](#gen-ai-output-messages-tool-call-span-1) |

<span id="gen-ai-input-messages-tool-call-span-1">`gen_ai.input.messages` value</span>

@@ -362,8 +362,8 @@ If tool call is [instrumented according to execute-tool span definition](../gen-
| `gen_ai.usage.output_tokens` | `52` |
| `gen_ai.usage.input_tokens` | `97` |
| `gen_ai.response.finish_reasons`| `["stop"]` |
-| `gen_ai.input.messages` | <span id="gen-ai-input-messages-tool-call-span-2">`gen_ai.input.messages` value</span> |
-| `gen_ai.output.messages` | <span id="gen-ai-output-messages-tool-call-span-2">`gen_ai.output.messages` value</span> |
+| `gen_ai.input.messages` | [`gen_ai.input.messages`](#gen-ai-input-messages-tool-call-span-2) |
+| `gen_ai.output.messages` | [`gen_ai.output.messages`](#gen-ai-output-messages-tool-call-span-2) |

<span id="gen-ai-input-messages-tool-call-span-2">`gen_ai.input.messages` value</span>

@@ -424,7 +424,7 @@ If tool call is [instrumented according to execute-tool span definition](../gen-
## System instructions along with chat history (content enabled)

Some providers allow instructions to be provided separately from the chat history passed in the inputs
-or in addition to `system` (`deveoper`, etc) message provided in the input.
+or in addition to `system` (`developer`, etc) message provided in the input.

This example demonstrates an edge case where conflicting instructions are provided
to the OpenAI Responses API. In this case, instructions are recorded in the `gen_ai.system_instructions` attribute.
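A minimal sketch of a request that can produce this edge case is shown below, assuming the OpenAI Python client; the model name and message contents are illustrative.

```python
from openai import OpenAI

client = OpenAI()

response = client.responses.create(
    model="gpt-4o-mini",
    # API-level instructions...
    instructions="You are a terse assistant. Answer in one short sentence.",
    input=[
        # ...conflicting with a developer message supplied in the input.
        {"role": "developer", "content": "Answer in great detail, with examples."},
        {"role": "user", "content": "How do optical fibers carry light?"},
    ],
)
print(response.output_text)
```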
@@ -550,8 +550,8 @@ sequenceDiagram
| `gen_ai.usage.output_tokens` | `44` |
| `gen_ai.usage.input_tokens` | `385` |
| `gen_ai.response.finish_reasons`| `["stop"]` |
-| `gen_ai.input.messages` | <span id="gen-ai-input-messages-built-in-tools">`gen_ai.input.messages` value</span> |
-| `gen_ai.output.messages` | <span id="gen-ai-output-messages-built-in-tools">`gen_ai.output.messages` value</span> |
+| `gen_ai.input.messages` | [`gen_ai.input.messages`](#gen-ai-input-messages-built-in-tools) |
+| `gen_ai.output.messages` | [`gen_ai.output.messages`](#gen-ai-output-messages-built-in-tools) |

<span id="gen-ai-input-messages-built-in-tools">`gen_ai.input.messages` value</span>

@@ -598,8 +598,8 @@ sequenceDiagram
| `gen_ai.usage.output_tokens` | `77` |
| `gen_ai.usage.input_tokens` | `52` |
| `gen_ai.response.finish_reasons`| `["stop", "stop"]` |
-| `gen_ai.input.messages` | <span id="gen-ai-input-messages-choices">`gen_ai.input.messages` value</span> |
-| `gen_ai.output.messages` | <span id="gen-ai-output-messages-choices">`gen_ai.output.messages` value</span> |
+| `gen_ai.input.messages` | [`gen_ai.input.messages`](#gen-ai-input-messages-choices) |
+| `gen_ai.output.messages` | [`gen_ai.output.messages`](#gen-ai-output-messages-choices) |

<span id="gen-ai-input-messages-choices">`gen_ai.input.messages` value</span>
