Labels: bug (Something isn't working)
Description
Describe the bug
Traceback from Ray worker `<lambda>` (pid=1543323, ip=172.16.4.97):

```text
  File "/tmp/ray/session_2025-06-05_07-19-04_642991_1676/runtime_resources/pip/42bb406fe48ce446a6ce2a9cea155757cb9174a4/virtualenv/lib/python3.12/site-packages/llama_index/core/response_synthesizers/compact_and_refine.py", line 22, in aget_response
    compact_texts = self._make_compact_text_chunks(query_str, text_chunks)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/tmp/ray/session_2025-06-05_07-19-04_642991_1676/runtime_resources/pip/42bb406fe48ce446a6ce2a9cea155757cb9174a4/virtualenv/lib/python3.12/site-packages/llama_index/core/response_synthesizers/compact_and_refine.py", line 57, in _make_compact_text_chunks
    return self._prompt_helper.repack(max_prompt, text_chunks, llm=self._llm)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/tmp/ray/session_2025-06-05_07-19-04_642991_1676/runtime_resources/pip/42bb406fe48ce446a6ce2a9cea155757cb9174a4/virtualenv/lib/python3.12/site-packages/llama_index/core/indices/prompt_helper.py", line 302, in repack
    text_splitter = self.get_text_splitter_given_prompt(
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/tmp/ray/session_2025-06-05_07-19-04_642991_1676/runtime_resources/pip/42bb406fe48ce446a6ce2a9cea155757cb9174a4/virtualenv/lib/python3.12/site-packages/llama_index/core/indices/prompt_helper.py", line 256, in get_text_splitter_given_prompt
    chunk_size = self._get_available_chunk_size(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/tmp/ray/session_2025-06-05_07-19-04_642991_1676/runtime_resources/pip/42bb406fe48ce446a6ce2a9cea155757cb9174a4/virtualenv/lib/python3.12/site-packages/llama_index/core/indices/prompt_helper.py", line 238, in _get_available_chunk_size
    available_context_size = self._get_available_context_size(num_prompt_tokens)
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/tmp/ray/session_2025-06-05_07-19-04_642991_1676/runtime_resources/pip/42bb406fe48ce446a6ce2a9cea155757cb9174a4/virtualenv/lib/python3.12/site-packages/llama_index/core/indices/prompt_helper.py", line 158, in _get_available_context_size
    raise ValueError(
ValueError: Calculated available context size -4019 was not non-negative.
```
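Based on the `prompt_helper.py` frames in the traceback, the error fires when the prompt's token count exceeds the configured context window minus the tokens reserved for the model's output. A minimal sketch of that arithmetic, without llama_index itself; the function name and the concrete numbers here are illustrative (chosen only so the result matches the reported -4019), not taken from the failing run:

```python
def get_available_context_size(context_window: int, num_output: int,
                               num_prompt_tokens: int) -> int:
    """Illustrative stand-in for the check in prompt_helper.py:
    tokens left for text chunks = window - prompt tokens - reserved output."""
    available = context_window - num_prompt_tokens - num_output
    if available < 0:
        raise ValueError(
            f"Calculated available context size {available} was not non-negative."
        )
    return available

# A prompt larger than the configured window reproduces the failure mode:
try:
    get_available_context_size(context_window=4096, num_output=256,
                               num_prompt_tokens=7859)
except ValueError as e:
    print(e)  # Calculated available context size -4019 was not non-negative.
```

In other words, the fix direction is to either raise `context_window` to match the model actually being served, or shrink the prompt (fewer/smaller retrieved chunks, smaller `num_output`).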
To Reproduce
Run `make check` and then run the provided command.