
AttributeError: 'LegacyAPIResponse' object has no attribute 'model' #81

@soulcape

Description


Describe your environment

opentelemetry-api 1.39.0
opentelemetry-distro 0.60b0
opentelemetry-exporter-otlp 1.39.0
opentelemetry-exporter-otlp-proto-common 1.39.0
opentelemetry-exporter-otlp-proto-grpc 1.39.0
opentelemetry-exporter-otlp-proto-http 1.39.0
opentelemetry-instrumentation 0.60b0
opentelemetry-instrumentation-aiohttp-client 0.60b0
opentelemetry-instrumentation-aiohttp-server 0.60b0
opentelemetry-instrumentation-asyncio 0.60b0
opentelemetry-instrumentation-dbapi 0.60b0
opentelemetry-instrumentation-grpc 0.60b0
opentelemetry-instrumentation-httpx 0.60b0
opentelemetry-instrumentation-logging 0.60b0
opentelemetry-instrumentation-openai-v2 2.2b0
opentelemetry-instrumentation-requests 0.60b0
opentelemetry-instrumentation-sqlalchemy 0.60b0
opentelemetry-instrumentation-sqlite3 0.60b0
opentelemetry-instrumentation-threading 0.60b0
opentelemetry-instrumentation-tortoiseorm 0.60b0
opentelemetry-instrumentation-urllib 0.60b0
opentelemetry-instrumentation-urllib3 0.60b0
opentelemetry-instrumentation-wsgi 0.60b0
opentelemetry-proto 1.39.0
opentelemetry-sdk 1.39.0
opentelemetry-semantic-conventions 0.60b0
opentelemetry-util-http 0.60b0
langchain-classic 1.0.0
langchain-community 0.4.1
langchain-core 1.1.1
langchain-openai 1.1.0
langchain-text-splitters 1.0.0
loongsuite-instrumentation-langchain 1.0.0

What happened?

When running the demo code below against a local model, the following error occurs: AttributeError: 'LegacyAPIResponse' object has no attribute 'model'

demo.py:

from langchain_core.messages import HumanMessage, SystemMessage
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="Qwen3-32B",  # adjust to your actual model name
    base_url="xxxxx",  # local model service address
    api_key="sk-aaaa",  # can be any value if the service does not require auth
    temperature=0.7,
    max_tokens=1000,
)
messages = [
    SystemMessage(
        content="You are a helpful assistant that translates English to French."
    ),
    HumanMessage(
        content="Translate this sentence from English to French. I love programming."
    ),
]
res = llm.invoke(messages)
print(res)

Steps to Reproduce

log detail:

(loongsuit_env) (base) [root@HikvisionOS loongsuit]# opentelemetry-instrument --traces_exporter console --metrics_exporter console python test1.py
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1764919993.076766 888946 http_connect_handshaker.cc:114] HTTP proxy handshake with ipv4:10.20.84.23:8081 failed: UNKNOWN: HTTP proxy returned response code 403
Transient error StatusCode.UNAVAILABLE encountered while exporting logs to localhost:4317, retrying in 1.10s.
Failed to export logs to localhost:4317, error code: StatusCode.UNAVAILABLE
Traceback (most recent call last):
File "/opt/jlj_file/loongsuit/test1.py", line 24, in
res = llm.with_config({"model": "Qwen3-32B"}).invoke(messages)
File "/opt/jlj_file/loongsuit/env/loongsuit_env/lib/python3.13/site-packages/langchain_core/runnables/base.py", line 5548, in invoke
return self.bound.invoke(
~~~~~~~~~~~~~~~~~^
input,
^^^^^^
self._merge_configs(config),
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
**{**self.kwargs, **kwargs},
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
)
^
File "/opt/jlj_file/loongsuit/env/loongsuit_env/lib/python3.13/site-packages/langchain_core/language_models/chat_models.py", line 398, in invoke
self.generate_prompt(
~~~~~~~~~~~~~~~~~~~~^
[self._convert_input(input)],
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
...<6 lines>...
**kwargs,
^^^^^^^^^
).generations[0][0],
^
File "/opt/jlj_file/loongsuit/env/loongsuit_env/lib/python3.13/site-packages/langchain_core/language_models/chat_models.py", line 1117, in generate_prompt
return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/jlj_file/loongsuit/env/loongsuit_env/lib/python3.13/site-packages/langchain_core/language_models/chat_models.py", line 927, in generate
self._generate_with_cache(
~~~~~~~~~~~~~~~~~~~~~~~~~^
m,
^^
...<2 lines>...
**kwargs,
^^^^^^^^^
)
^
File "/opt/jlj_file/loongsuit/env/loongsuit_env/lib/python3.13/site-packages/langchain_core/language_models/chat_models.py", line 1221, in _generate_with_cache
result = self._generate(
messages, stop=stop, run_manager=run_manager, **kwargs
)
File "/opt/jlj_file/loongsuit/env/loongsuit_env/lib/python3.13/site-packages/langchain_openai/chat_models/base.py", line 1356, in _generate
raise e
File "/opt/jlj_file/loongsuit/env/loongsuit_env/lib/python3.13/site-packages/langchain_openai/chat_models/base.py", line 1351, in _generate
raw_response = self.client.with_raw_response.create(**payload)
File "/opt/jlj_file/loongsuit/env/loongsuit_env/lib/python3.13/site-packages/openai/_legacy_response.py", line 364, in wrapped
return cast(LegacyAPIResponse[R], func(*args, **kwargs))
~~~~^^^^^^^^^^^^^^^^^
File "/opt/jlj_file/loongsuit/env/loongsuit_env/lib/python3.13/site-packages/opentelemetry/instrumentation/openai_v2/patch.py", line 73, in traced_method
_set_response_attributes(
~~~~~~~~~~~~~~~~~~~~~~~~^
span, result, logger, capture_content
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
)
^
File "/opt/jlj_file/loongsuit/env/loongsuit_env/lib/python3.13/site-packages/opentelemetry/instrumentation/openai_v2/patch.py", line 353, in _set_response_attributes
span, GenAIAttributes.GEN_AI_RESPONSE_MODEL, result.model
^^^^^^^^^^^^
AttributeError: 'LegacyAPIResponse' object has no attribute 'model'
{
"resource_metrics": [
{
"resource": {
"attributes": {
"telemetry.sdk.language": "python",
"telemetry.sdk.name": "opentelemetry",
"telemetry.sdk.version": "1.39.0",
"telemetry.auto.version": "0.60b0",
"service.name": "unknown_service"
},
"schema_url": ""
},
"scope_metrics": [
{
"scope": {
"name": "opentelemetry.instrumentation.httpx",
"version": "0.60b0",
"schema_url": "https://opentelemetry.io/schemas/1.11.0",
"attributes": null
},
"metrics": [
{
"name": "http.client.duration",
"description": "measures the duration of the outbound HTTP request",
"unit": "ms",
"data": {
"data_points": [
{
"attributes": {
"http.method": "POST",
"http.scheme": "http",
"http.status_code": 200
},
"start_time_unix_nano": 1764920016981436262,
"time_unix_nano": 1764920016997010466,
"count": 1,
"sum": 26319,
"bucket_counts": [
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
1
],
"explicit_bounds": [
0.0,
5.0,
10.0,
25.0,
50.0,
75.0,
100.0,
250.0,
500.0,
750.0,
1000.0,
2500.0,
5000.0,
7500.0,
10000.0
],
"min": 26319,
"max": 26319,
"exemplars": [
{
"filtered_attributes": {},
"value": 26319,
"time_unix_nano": 1764920016981230008,
"span_id": 6840241758990152610,
"trace_id": 257036448915085777232406887536521970834
}
]
}
],
"aggregation_temporality": 2
}
}
],
"schema_url": "https://opentelemetry.io/schemas/1.11.0"
},
{
"scope": {
"name": "opentelemetry.instrumentation.openai_v2",
"version": "",
"schema_url": "https://opentelemetry.io/schemas/1.28.0",
"attributes": null
},
"metrics": [
{
"name": "gen_ai.client.operation.duration",
"description": "GenAI operation duration",
"unit": "s",
"data": {
"data_points": [
{
"attributes": {
"gen_ai.operation.name": "chat",
"gen_ai.system": "openai",
"gen_ai.request.model": "Qwen3-32B",
"error.type": "AttributeError",
"server.address": "10.19.154.22",
"server.port": 28009
},
"start_time_unix_nano": 1764920016982660778,
"time_unix_nano": 1764920016997010466,
"count": 1,
"sum": 26.327739343978465,
"bucket_counts": [
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
1,
0,
0
],
"explicit_bounds": [
0.01,
0.02,
0.04,
0.08,
0.16,
0.32,
0.64,
1.28,
2.56,
5.12,
10.24,
20.48,
40.96,
81.92
],
"min": 26.327739343978465,
"max": 26.327739343978465,
"exemplars": [
{
"filtered_attributes": {},
"value": 26.327739343978465,
"time_unix_nano": 1764920016982597082,
"span_id": 1175045826866977486,
"trace_id": 257036448915085777232406887536521970834
}
]
}
],
"aggregation_temporality": 2
}
}
],
"schema_url": "https://opentelemetry.io/schemas/1.28.0"
}
],
"schema_url": ""
}
]
}
{
"name": "POST",
"context": {
"trace_id": "0xc15f6b5c4153a8b0930bc96f192d0492",
"span_id": "0x5eed6ba37e66b3a2",
"trace_state": "[]"
},
"kind": "SpanKind.CLIENT",
"parent_id": "0x104e99f431361ace",
"start_time": "2025-12-05T07:33:10.662257Z",
"end_time": "2025-12-05T07:33:36.981507Z",
"status": {
"status_code": "UNSET"
},
"attributes": {
"http.method": "POST",
"http.url": "http://10.19.154.22:28009/v1/chat/completions",
"http.status_code": 200
},
"events": [],
"links": [],
"resource": {
"attributes": {
"telemetry.sdk.language": "python",
"telemetry.sdk.name": "opentelemetry",
"telemetry.sdk.version": "1.39.0",
"telemetry.auto.version": "0.60b0",
"service.name": "unknown_service"
},
"schema_url": ""
}
}
{
"name": "chat Qwen3-32B",
"context": {
"trace_id": "0xc15f6b5c4153a8b0930bc96f192d0492",
"span_id": "0x104e99f431361ace",
"trace_state": "[]"
},
"kind": "SpanKind.CLIENT",
"parent_id": null,
"start_time": "2025-12-05T07:33:10.654676Z",
"end_time": "2025-12-05T07:33:36.982552Z",
"status": {
"status_code": "ERROR",
"description": "'LegacyAPIResponse' object has no attribute 'model'"
},
"attributes": {
"gen_ai.operation.name": "chat",
"gen_ai.system": "openai",
"gen_ai.request.model": "Qwen3-32B",
"gen_ai.request.temperature": 0.7,
"server.address": "10.19.154.22",
"server.port": 28009,
"error.type": "AttributeError"
},
"events": [],
"links": [],
"resource": {
"attributes": {
"telemetry.sdk.language": "python",
"telemetry.sdk.name": "opentelemetry",
"telemetry.sdk.version": "1.39.0",
"telemetry.auto.version": "0.60b0",
"service.name": "unknown_service"
},
"schema_url": ""
}
}
{
"name": "ChatOpenAI",
"context": {
"trace_id": "0x1383cf223b8cd07aa1c35c1a6ce710e1",
"span_id": "0xefb9f98f7d9bb248",
"trace_state": "[]"
},
"kind": "SpanKind.INTERNAL",
"parent_id": null,
"start_time": "2025-12-05T07:33:10.653257Z",
"end_time": "2025-12-05T07:33:36.990239Z",
"status": {
"status_code": "ERROR",
"description": "AttributeError("'LegacyAPIResponse' object has no attribute 'model'")Traceback (most recent call last):\n\n\n File "/opt/jlj_file/loongsuit/env/loongsuit_env/lib/python3.13/site-packages/langchain_core/language_models/chat_models.py", line 927, in generate\n self._generate_with_cache(\n ~~~~~~~~~~~~~~~~~~~~~~~~~^\n m,\n ^^\n ...<2 lines>...\n **kwargs,\n ^^^^^^^^^\n )\n ^\n\n\n File "/opt/jlj_file/loongsuit/env/loongsuit_env/lib/python3.13/site-packages/langchain_core/language_models/chat_models.py", line 1221, in _generate_with_cache\n result = self._generate(\n messages, stop=stop, run_manager=run_manager, **kwargs\n )\n\n\n File "/opt/jlj_file/loongsuit/env/loongsuit_env/lib/python3.13/site-packages/langchain_openai/chat_models/base.py", line 1356, in _generate\n raise e\n\n\n File "/opt/jlj_file/loongsuit/env/loongsuit_env/lib/python3.13/site-packages/langchain_openai/chat_models/base.py", line 1351, in _generate\n raw_response = self.client.with_raw_response.create(**payload)\n\n\n File "/opt/jlj_file/loongsuit/env/loongsuit_env/lib/python3.13/site-packages/openai/_legacy_response.py", line 364, in wrapped\n return cast(LegacyAPIResponse[R], func(*args, **kwargs))\n ~~~~^^^^^^^^^^^^^^^^^\n\n\n File "/opt/jlj_file/loongsuit/env/loongsuit_env/lib/python3.13/site-packages/opentelemetry/instrumentation/openai_v2/patch.py", line 73, in traced_method\n _set_response_attributes(\n ~~~~~~~~~~~~~~~~~~~~~~~~^\n span, result, logger, capture_content\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n )\n ^\n\n\n File "/opt/jlj_file/loongsuit/env/loongsuit_env/lib/python3.13/site-packages/opentelemetry/instrumentation/openai_v2/patch.py", line 353, in _set_response_attributes\n span, GenAIAttributes.GEN_AI_RESPONSE_MODEL, result.model\n ^^^^^^^^^^^^\n\n\nAttributeError: 'LegacyAPIResponse' object has no attribute 'model'"
},
"attributes": {
"gen_ai.span.kind": "llm",
"input.value": "{"prompts": ["System: You are a helpful assistant that translates English to French.\nHuman: Translate this sentence from English to French. I love programming."]}",
"input.mime_type": "application/json",
"output.value": "{"generations": [[]], "llm_output": null, "run": null, "type": "LLMResult"}",
"output.mime_type": "application/json",
"gen_ai.prompt.0.content": "System: You are a helpful assistant that translates English to French.\nHuman: Translate this sentence from English to French. I love programming.",
"gen_ai.request.model": "Qwen3-32B",
"metadata": "{"model": "Qwen3-32B", "ls_provider": "openai", "ls_model_name": "Qwen3-32B", "ls_model_type": "chat", "ls_temperature": 0.7, "ls_max_tokens": 1000}"
},
"events": [
{
"name": "exception",
"timestamp": "2025-12-05T07:33:36.987783Z",
"attributes": {
"exception.type": "AttributeError",
"exception.message": "'LegacyAPIResponse' object has no attribute 'model'",
"exception.stacktrace": "Traceback (most recent call last):\n File "/opt/jlj_file/loongsuit/env/loongsuit_env/lib/python3.13/site-packages/langchain_core/language_models/chat_models.py", line 927, in generate\n self._generate_with_cache(\n ~~~~~~~~~~~~~~~~~~~~~~~~~^\n m,\n ^^\n ...<2 lines>...\n **kwargs,\n ^^^^^^^^^\n )\n ^\n File "/opt/jlj_file/loongsuit/env/loongsuit_env/lib/python3.13/site-packages/langchain_core/language_models/chat_models.py", line 1221, in _generate_with_cache\n result = self._generate(\n messages, stop=stop, run_manager=run_manager, **kwargs\n )\n File "/opt/jlj_file/loongsuit/env/loongsuit_env/lib/python3.13/site-packages/langchain_openai/chat_models/base.py", line 1356, in _generate\n raise e\n File "/opt/jlj_file/loongsuit/env/loongsuit_env/lib/python3.13/site-packages/langchain_openai/chat_models/base.py", line 1351, in _generate\n raw_response = self.client.with_raw_response.create(**payload)\n File "/opt/jlj_file/loongsuit/env/loongsuit_env/lib/python3.13/site-packages/openai/_legacy_response.py", line 364, in wrapped\n return cast(LegacyAPIResponse[R], func(*args, **kwargs))\n ~~~~^^^^^^^^^^^^^^^^^\n File "/opt/jlj_file/loongsuit/env/loongsuit_env/lib/python3.13/site-packages/opentelemetry/instrumentation/openai_v2/patch.py", line 73, in traced_method\n _set_response_attributes(\n ~~~~~~~~~~~~~~~~~~~~~~~~^\n span, result, logger, capture_content\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n )\n ^\n File "/opt/jlj_file/loongsuit/env/loongsuit_env/lib/python3.13/site-packages/opentelemetry/instrumentation/openai_v2/patch.py", line 353, in _set_response_attributes\n span, GenAIAttributes.GEN_AI_RESPONSE_MODEL, result.model\n ^^^^^^^^^^^^\nAttributeError: 'LegacyAPIResponse' object has no attribute 'model'\n",
"exception.escaped": "False"
}
}
],
"links": [],
"resource": {
"attributes": {
"telemetry.sdk.language": "python",
"telemetry.sdk.name": "opentelemetry",
"telemetry.sdk.version": "1.39.0",
"telemetry.auto.version": "0.60b0",
"service.name": "unknown_service"
},
"schema_url": ""
}
}
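The traceback points at the root cause: langchain-openai 1.x calls `self.client.with_raw_response.create(**payload)`, so the instrumented `create` returns an openai `LegacyAPIResponse` wrapper rather than a parsed `ChatCompletion`. The wrapper exposes the parsed body via `.parse()` and has no `.model` attribute, but `_set_response_attributes` in the instrumentation's patch.py reads `result.model` directly. A minimal sketch of a defensive unwrap that would tolerate both shapes (the classes below are stand-ins for illustration, not the real openai types):

```python
class ChatCompletion:
    """Stand-in for a parsed OpenAI chat completion object."""
    model = "Qwen3-32B"


class LegacyAPIResponse:
    """Stand-in for openai._legacy_response.LegacyAPIResponse:
    a raw-HTTP wrapper that only exposes the parsed body via .parse()."""
    def parse(self):
        return ChatCompletion()


def response_model(result):
    """Read the response model defensively: unwrap raw-response
    wrappers before accessing .model, and fall back to None."""
    if hasattr(result, "parse") and not hasattr(result, "model"):
        result = result.parse()
    return getattr(result, "model", None)


print(response_model(LegacyAPIResponse()))  # wrapped response
print(response_model(ChatCompletion()))     # already-parsed response
```

Both calls print `Qwen3-32B`; with the current instrumentation, the first case raises the `AttributeError` seen above instead.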

Expected Result

...

Actual Result

....

Additional context

A temporary workaround is to uninstall opentelemetry-instrumentation-openai-v2:

pip uninstall opentelemetry-instrumentation-openai-v2
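A less invasive workaround may be to disable only this instrumentor through the standard OTEL_PYTHON_DISABLED_INSTRUMENTATIONS environment variable instead of uninstalling the package. Note that `openai-v2` is an assumed entry-point name here — verify the instrumentor entry point actually registered by the package before relying on it:

```shell
# Disable only the openai-v2 instrumentor; other auto-instrumentation stays active.
# "openai-v2" is the assumed entry-point name — check the package metadata.
export OTEL_PYTHON_DISABLED_INSTRUMENTATIONS="openai-v2"
```

Then run `opentelemetry-instrument --traces_exporter console --metrics_exporter console python test1.py` as before.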

Would you like to implement a fix?

None


Labels: bug, genai, instrumentaion
