Did you categorize with namespaces?
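For readers unfamiliar with the suggestion: namespaces partition a vector store so each query searches only one slice instead of the whole index. A minimal, framework-free sketch of the idea (the store and function names here are illustrative, not any particular library's API):

```python
from math import sqrt

# Toy namespaced vector store: each namespace holds its own (vector, text) list.
store = {}  # namespace -> list of (vector, text)

def upsert(namespace, vector, text):
    """Add a document vector under a specific namespace."""
    store.setdefault(namespace, []).append((vector, text))

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def query(namespace, vector, k=1):
    """Search only within one namespace; other namespaces are never scanned."""
    candidates = store.get(namespace, [])
    ranked = sorted(candidates, key=lambda item: cosine(item[0], vector), reverse=True)
    return [text for _, text in ranked[:k]]

upsert("truck_a", [1.0, 0.0], "Truck A oil spec")
upsert("truck_b", [0.9, 0.1], "Truck B oil spec")
print(query("truck_a", [1.0, 0.0]))  # only the "truck_a" slice is considered
```

Real vector databases such as Pinecone expose the same concept natively, so partitioning documents per truck model at upsert time would make the later filtering trivial.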
My requirement is to retrieve answers from specific documents in the vector store based on an input question, rather than searching across all the documents. To achieve this, I will pass two parameters as input: the question itself and a filter. The filter determines which documents in the vector store should be searched for that question. Below is a code sample for clarification:
import os

import streamlit as st
from langchain.chains import ConversationalRetrievalChain
from langchain.embeddings import HuggingFaceInstructEmbeddings
from langchain.llms import OpenAI
from langchain.memory import ConversationBufferMemory
from langchain.vectorstores import FAISS

os.environ["OPENAI_API_KEY"] = "API_KEY"
os.environ["HUGGINGFACEHUB_API_TOKEN"] = "API_TOKEN"

embeddings = None  # Global variable for embeddings
docsearch = None   # Global variable for docsearch

def load_embeddings():
    global embeddings
    if embeddings is None:
        embeddings = HuggingFaceInstructEmbeddings(
            query_instruction="Represent the query for retrieval: "
        )

@st.cache_resource
def prep_qa_langchain(truck_model, metadata_field_info=None, document_content_description=None):
    # Make the embeddings and load up the FAISS index
    global docsearch
    print(truck_model)
    if truck_model is not None:
        load_embeddings()
        docsearch = FAISS.load_local("faiss_index", embeddings)
        docs = docsearch.similarity_search("")

    # Build the conversational retrieval chain over the loaded index
    memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
    truck_manual_rqa = ConversationalRetrievalChain.from_llm(
        OpenAI(), retriever=docsearch.as_retriever(), memory=memory
    )

    def run_conversational_chain(question):
        return truck_manual_rqa.run(question)

    return run_conversational_chain
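Independently of the framework, the requested filter amounts to restricting candidates to documents whose metadata matches before ranking by similarity. A minimal sketch under that assumption, using a flat list of documents with metadata dicts (the names and data here are illustrative, not the langchain API):

```python
from math import sqrt

# Flat store: each entry is (vector, text, metadata).
docs = [
    ([1.0, 0.0], "Engine torque table", {"source": "manual_a.pdf"}),
    ([0.8, 0.6], "Brake service steps", {"source": "manual_b.pdf"}),
    ([0.9, 0.1], "Oil change interval", {"source": "manual_a.pdf"}),
]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b)))

def filtered_search(query_vec, metadata_filter, k=2):
    """Keep only documents whose metadata matches every key in
    `metadata_filter`, then rank the survivors by cosine similarity."""
    candidates = [
        (vec, text) for vec, text, meta in docs
        if all(meta.get(key) == value for key, value in metadata_filter.items())
    ]
    candidates.sort(key=lambda item: cosine(item[0], query_vec), reverse=True)
    return [text for _, text in candidates[:k]]

# The question would be embedded first; only manual_a.pdf is searched here.
print(filtered_search([1.0, 0.0], {"source": "manual_a.pdf"}))
```

Recent langchain versions expose a similar `filter` argument on the FAISS vector store's `similarity_search` (and via the retriever's `search_kwargs`); check the documentation of the installed version before relying on it.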