-
Has anyone solved this issue yet?
-
This is how I fixed it: set chunk_overlap_ratio=0.1 and delete the max_chunk_overlap option. For example:
...
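To see why an out-of-range value triggers the error, here is a small standalone mimic of the range check that llama_index performs when constructing a PromptHelper (the function name here is hypothetical; the real check runs inside PromptHelper.__init__, as the traceback in this thread shows):

```python
def check_chunk_overlap_ratio(chunk_overlap_ratio):
    # Same range check PromptHelper performs: the overlap is expressed as a
    # fraction of the chunk size, so it must lie between 0.0 and 1.0.
    if chunk_overlap_ratio > 1.0 or chunk_overlap_ratio < 0.0:
        raise ValueError("chunk_overlap_ratio must be a float between 0. and 1.")
    return chunk_overlap_ratio

check_chunk_overlap_ratio(0.1)  # the suggested value 0.1 passes the check
```

A value like the old max_chunk_overlap (typically a token count such as 20) fails this check, which is why the option has to be replaced rather than renamed.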
-
I am getting the following error:
ValueError Traceback (most recent call last)
in <cell line: 1>()
----> 1 construct_index('/content/meditations.mb.txt')
1 frames
/usr/local/lib/python3.10/dist-packages/llama_index/indices/prompt_helper.py in __init__(self, context_window, num_output, chunk_overlap_ratio, chunk_size_limit, tokenizer, separator, max_input_size, embedding_limit, max_chunk_overlap)
70 self.chunk_overlap_ratio = chunk_overlap_ratio
71 if self.chunk_overlap_ratio > 1.0 or self.chunk_overlap_ratio < 0.0:
---> 72 raise ValueError("chunk_overlap_ratio must be a float between 0. and 1.")
73 self.chunk_size_limit = chunk_size_limit
74
ValueError: chunk_overlap_ratio must be a float between 0. and 1.
for this code:
!pip install langchain
!pip install llama_index
from llama_index import SimpleDirectoryReader, GPTListIndex, GPTVectorStoreIndex, LLMPredictor, PromptHelper
from llama_index import StorageContext, load_index_from_storage
from langchain import OpenAI
import sys
import os

def ask_bot(question):
    storage_context = StorageContext.from_defaults(persist_dir='input_index')
    index = load_index_from_storage(storage_context)
    query_engine = index.as_query_engine()
    response = query_engine.query(question)
    return response

!wget http://classics.mit.edu/Antoninus/meditations.mb.txt
construct_index('/content/meditations.mb.txt')
The above code should create a JSON file from which I can access the data, but the file is never created.
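One likely reason no JSON file appears: nothing in the snippet persists the index to disk. construct_index isn't shown, but in llama_index an index only writes JSON files when it is explicitly persisted (index.storage_context.persist(persist_dir=...)), and ask_bot() then loads from that same directory. A hypothetical pure-Python sketch of that persistence step, just to illustrate what has to happen before any file exists:

```python
import json
import os
import tempfile

def persist_index(index_dict, persist_dir):
    # Mimics the effect of index.storage_context.persist(persist_dir=...):
    # unless a persist step actually runs, no JSON file appears on disk.
    os.makedirs(persist_dir, exist_ok=True)
    path = os.path.join(persist_dir, "index_store.json")
    with open(path, "w") as f:
        json.dump(index_dict, f)
    return path

# Usage: persist into the same directory that ask_bot() later loads from.
with tempfile.TemporaryDirectory() as d:
    path = persist_index({"source": "meditations.mb.txt"},
                         os.path.join(d, "input_index"))
    print(os.path.exists(path))  # True once persist has run
```

The index_dict and directory names here are placeholders; the point is that construct_index must end with a persist call targeting 'input_index' for the rest of the snippet to work.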