Labels: LLM API (Various issues caused by large model APIs)
Description
Error message
An error occurred during the research process: Error code: 400 - {'code': 20015, 'message': 'length of prompt_tokens (4560) must be less than max_seq_len (4096).', 'data': None}
Error details
Traceback (most recent call last):
File "/app/SingleEngineApp/media_engine_streamlit_app.py", line 182, in execute_research
agent._initial_search_and_summary(i)
File "/app/SingleEngineApp/../MediaEngine/agent.py", line 285, in _initial_search_and_summary
self.state = self.first_summary_node.mutate_state(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/SingleEngineApp/../MediaEngine/nodes/summary_node.py", line 199, in mutate_state
raise e
File "/app/SingleEngineApp/../MediaEngine/nodes/summary_node.py", line 185, in mutate_state
summary = self.run(input_data, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/SingleEngineApp/../MediaEngine/nodes/summary_node.py", line 116, in run
raise e
File "/app/SingleEngineApp/../MediaEngine/nodes/summary_node.py", line 103, in run
response = self.llm_client.stream_invoke_to_string(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/utils/retry_helper.py", line 89, in wrapper
raise e
File "/app/utils/retry_helper.py", line 77, in wrapper
result = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/app/SingleEngineApp/../MediaEngine/llms/base.py", line 151, in stream_invoke_to_string
for chunk in self.stream_invoke(system_prompt, user_prompt, **kwargs):
File "/app/SingleEngineApp/../MediaEngine/llms/base.py", line 134, in stream_invoke
raise e
File "/app/SingleEngineApp/../MediaEngine/llms/base.py", line 120, in stream_invoke
stream = self.client.chat.completions.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/openai/_utils/_utils.py", line 286, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/openai/resources/chat/completions/completions.py", line 1192, in create
return self._post(
^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/openai/_base_client.py", line 1259, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/openai/_base_client.py", line 1047, in request
raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'code': 20015, 'message': 'length of prompt_tokens (4560) must be less than max_seq_len (4096).', 'data': None}
Environment
- Application: Media Engine Streamlit App
- Time: 2026-01-13 07:39:11
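
Possible workaround (sketch)

The server rejects the request because the prompt alone (4560 tokens) already exceeds the backend's max_seq_len of 4096, so a client-side guard that trims the prompt before calling the API would avoid the 400. Below is a minimal sketch, not MediaEngine's actual code: it assumes tiktoken's cl100k_base encoding roughly approximates the serving model's tokenizer, and the names truncate_to_fit, MAX_SEQ_LEN, and RESERVED_FOR_OUTPUT are illustrative.

import tiktoken

MAX_SEQ_LEN = 4096         # limit reported in the error message
RESERVED_FOR_OUTPUT = 512  # headroom for the completion and tokenizer mismatch (illustrative)

_enc = tiktoken.get_encoding("cl100k_base")

def truncate_to_fit(system_prompt: str, user_prompt: str) -> str:
    """Trim user_prompt so system + user prompts stay under the token budget."""
    budget = MAX_SEQ_LEN - RESERVED_FOR_OUTPUT - len(_enc.encode(system_prompt))
    tokens = _enc.encode(user_prompt)
    if len(tokens) <= budget:
        return user_prompt
    # Keep only the first `budget` tokens; dropping the tail of the retrieved
    # material is a crude but effective way to satisfy the server-side limit.
    return _enc.decode(tokens[:budget])

Where to hook this in (for example, before self.llm_client.stream_invoke_to_string in summary_node.py) depends on the codebase; raising max_seq_len on the serving side, if the deployment allows it, would be an alternative.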