| Deprecated | Yes, use `gen_ai.response.text` instead |

### ai.seed

The seed; ideally, models given the same seed and the same other parameters will produce the exact same output.
| Example |`0.1`|
| Deprecated | Yes, use `gen_ai.request.temperature` instead |

### ai.tool_calls

For an AI model call, the tool calls that were made.

| Property | Value |
| --- | --- |
| Type |`string[]`|
| Has PII | true |
| Exists in OpenTelemetry | No |
| Example |`["tool_call_1","tool_call_2"]`|
| Deprecated | Yes, use `gen_ai.response.tool_calls` instead |
### ai.tools

For an AI model call, the functions that are available.

| Property | Value |
| --- | --- |
| Type |`string[]`|
| Has PII | false |
| Exists in OpenTelemetry | No |
| Example |`["function_1","function_2"]`|
| Deprecated | Yes, use `gen_ai.request.available_tools` instead |
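The deprecation tables above each name a `gen_ai.*` replacement key. A minimal migration sketch (a hypothetical helper, not an official SDK function) that renames the deprecated keys documented here:

```python
# Replacement keys taken from the deprecation tables above.
DEPRECATED_TO_REPLACEMENT = {
    "ai.tool_calls": "gen_ai.response.tool_calls",
    "ai.tools": "gen_ai.request.available_tools",
}


def migrate_attributes(attributes):
    """Rename deprecated ai.* keys to their gen_ai.* replacements.

    Keys without a known replacement are passed through unchanged.
    """
    return {
        DEPRECATED_TO_REPLACEMENT.get(key, key): value
        for key, value in attributes.items()
    }


migrated = migrate_attributes({"ai.tools": ["function_1"], "unrelated": 1})
# → {"gen_ai.request.available_tools": ["function_1"], "unrelated": 1}
```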
### ai.top_k

Limits the model to only consider the K most likely next tokens, where K is an integer (e.g., top_k=20 means only the 20 highest probability tokens are considered).
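The top_k mechanism described above can be sketched as a filter over token logits; everything outside the K most likely tokens is masked out before sampling (an illustrative sketch, not any model's actual implementation):

```python
def top_k_filter(logits, k):
    """Keep only the k highest logits; mask the rest to -inf."""
    # Indices of the k largest logits.
    keep = set(sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:k])
    # Masked tokens get probability 0 after softmax, so they are never sampled.
    return [x if i in keep else float("-inf") for i, x in enumerate(logits)]


# With top_k=2, only the two most likely tokens remain eligible.
filtered = top_k_filter([2.0, 1.0, 0.5, -1.0], 2)
# → [2.0, 1.0, -inf, -inf]
```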
 * The messages passed to the model. It has to be a stringified version of an array of objects. The "content" can be a string or an array of objects. `gen_ai.request.messages`
 *
 * Attribute Value Type: `string` {@link GEN_AI_REQUEST_MESSAGES_TYPE}
 *
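Since `gen_ai.request.messages` must be a stringified array of objects, producing the value amounts to JSON-serializing the message list. A minimal sketch with hypothetical message data:

```python
import json

# Hypothetical messages: "content" may be a plain string or an array of objects.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": [{"type": "text", "text": "Hello"}]},
]

# The attribute value is the stringified array, not the list itself.
gen_ai_request_messages = json.dumps(messages)
```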
    "brief": "The response messages sent back by the AI model.",
    "type": "string[]",
    "pii": {
      "key": "false"
    },
    "is_in_otel": false,
    "example": ["hello", "world"],
    "sdks": ["python"],
    "deprecation": {
      "_status": null,
      "replacement": "gen_ai.response.text"
    }
  },
  {
    "key": "ai.seed",
    "brief": "The seed, ideally models given the same seed and same other parameters will produce the exact same output.",
      "replacement": "gen_ai.request.temperature"
    }
  },
  {
    "key": "ai.tool_calls",
    "brief": "For an AI model call, the tool calls that were made.",
    "type": "string[]",
    "pii": {
      "key": "true"
    },
    "is_in_otel": false,
    "example": ["tool_call_1", "tool_call_2"],
    "deprecation": {
      "_status": null,
      "replacement": "gen_ai.response.tool_calls"
    }
  },
  {
    "key": "ai.tools",
    "brief": "For an AI model call, the functions that are available",
    "type": "string[]",
    "pii": {
      "key": "false"
    },
    "is_in_otel": false,
    "example": ["function_1", "function_2"],
    "deprecation": {
      "_status": null,
      "replacement": "gen_ai.request.available_tools"
    }
  },
  {
    "key": "ai.top_k",
    "brief": "Limits the model to only consider the K most likely next tokens, where K is an integer (e.g., top_k=20 means only the 20 highest probability tokens are considered).",