how to handle context_size token error #10078
Unanswered
lakshminarayanannn asked this question in Q&A
Replies: 1 comment 1 reply
-
Could you please help with this problem?
-
Even after setting the LLM context window and max_new_tokens to their maximum values for Mistral 7B Instruct, I still get this error.
Can anyone help me resolve it?
ValueError Traceback (most recent call last)
Cell In[8], line 1
----> 1 res = kg_index.as_query_engine().query(query)
2 print(res)
File ~/LLM_DEV/llm/lib64/python3.9/site-packages/llama_index/core/base_query_engine.py:40, in BaseQueryEngine.query(self, str_or_query_bundle)
38 if isinstance(str_or_query_bundle, str):
39 str_or_query_bundle = QueryBundle(str_or_query_bundle)
---> 40 return self._query(str_or_query_bundle)
File ~/LLM_DEV/llm/lib64/python3.9/site-packages/llama_index/query_engine/retriever_query_engine.py:172, in RetrieverQueryEngine._query(self, query_bundle)
168 with self.callback_manager.event(
169 CBEventType.QUERY, payload={EventPayload.QUERY_STR: query_bundle.query_str}
170 ) as query_event:
171 nodes = self.retrieve(query_bundle)
--> 172 response = self._response_synthesizer.synthesize(
173 query=query_bundle,
174 nodes=nodes,
175 )
177 query_event.on_end(payload={EventPayload.RESPONSE: response})
179 return response
File ~/LLM_DEV/llm/lib64/python3.9/site-packages/llama_index/response_synthesizers/base.py:168, in BaseSynthesizer.synthesize(self, query, nodes, additional_source_nodes, **response_kwargs)
163 query = QueryBundle(query_str=query)
...
152 " not non-negative."
153 )
154 return context_size_tokens
ValueError: Calculated available context size -22 was not non-negative.
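For anyone hitting the same ValueError: it appears to mean the prompt helper subtracted the prompt tokens and the reserved output budget (num_output) from the declared context_window and ended up with a negative number, i.e. the retrieved context plus the answer budget no longer fit in the window. Below is a minimal sketch of how those limits are typically configured with the legacy ServiceContext API (llama_index ~0.9.x, which the paths in the traceback suggest); the `llm` object, `kg_index`, `query`, and every numeric value are placeholders and assumptions, not a confirmed fix.

```python
# Sketch only: assumes the legacy ServiceContext API (llama_index ~0.9.x).
from llama_index import ServiceContext, set_global_service_context

service_context = ServiceContext.from_defaults(
    llm=llm,              # your Mistral 7B Instruct LLM object (assumed defined elsewhere)
    context_window=4096,  # do not declare more than the model actually supports
    num_output=256,       # tokens reserved for generation; a large value leaves less room for the prompt
    chunk_size=512,       # smaller chunks keep retrieved text from overflowing the window
)
set_global_service_context(service_context)

# NOTE: an index remembers the service context it was built with, so the index
# may need to be rebuilt (or the global context set before construction) for the
# new limits to take effect at query time.
query_engine = kg_index.as_query_engine(similarity_top_k=2)  # fewer retrieved nodes -> fewer prompt tokens
res = query_engine.query(query)
print(res)
```

The invariant to keep in mind is that context_window minus num_output must stay larger than the tokens consumed by the prompt template plus the retrieved nodes; whichever of those knobs cannot be raised, the others have to shrink.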