Replies: 1 comment 1 reply
Hey @FaYMan2! 👋 I'm here to help you with any bugs, questions, or contributions you have in mind. Let's squash those bugs together!

To resolve the index-name error when creating a history-aware conversational chain using LangChain with your own embedding function, ensure that the index name, node label, and embedding dimensions are correctly set and match the existing index configuration. The error typically occurs when there is a mismatch between the provided embedding function's dimensions and the existing vector index's dimensions. Here is a relevant code snippet that checks and handles the index name and dimensions:

```typescript
const embeddingDimension = await store.retrieveExistingIndex();
if (!embeddingDimension) {
  await store.createNewIndex();
} else if (store.embeddingDimension !== embeddingDimension) {
  throw new Error(
    `Index with name "${store.indexName}" already exists. The provided embedding function and vector index dimensions do not match.
Embedding function dimension: ${store.embeddingDimension}
Vector index dimension: ${embeddingDimension}`
  );
}
```

This code ensures that a new index is created when none exists yet, and that a clear error is raised when the provided embedding dimension conflicts with the dimension of the existing index.
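The guard above can also be sketched as a standalone function, which makes the three outcomes (create, match, conflict) easy to unit-test. The name `checkDimensions` and its signature are hypothetical, not part of the LangChain API:

```typescript
// Hypothetical standalone version of the dimension guard above.
// Returns "create" when no index exists yet, "ok" when dimensions match,
// and throws when they conflict.
function checkDimensions(
  indexName: string,
  providedDimension: number,
  existingDimension: number | undefined
): "create" | "ok" {
  if (existingDimension === undefined) {
    return "create"; // no index yet: safe to create one
  }
  if (providedDimension !== existingDimension) {
    throw new Error(
      `Index with name "${indexName}" already exists. ` +
        `Embedding function dimension: ${providedDimension}, ` +
        `vector index dimension: ${existingDimension}`
    );
  }
  return "ok";
}
```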
Additionally, when setting up the conversational retrieval chain, ensure that the retriever and prompts are wired together correctly:

```typescript
import { ChatAnthropic } from "@langchain/anthropic";
import { ChatPromptTemplate, MessagesPlaceholder } from "@langchain/core/prompts";
import { BaseMessage } from "@langchain/core/messages";
import { createStuffDocumentsChain } from "langchain/chains/combine_documents";
import { createHistoryAwareRetriever } from "langchain/chains/history_aware_retriever";
import { createRetrievalChain } from "langchain/chains/retrieval";

const retriever = ...; // your retriever
const llm = new ChatAnthropic();

// Contextualize question
const contextualizeQSystemPrompt = `
Given a chat history and the latest user question
which might reference context in the chat history,
formulate a standalone question which can be understood
without the chat history. Do NOT answer the question, just
reformulate it if needed and otherwise return it as is.`;
const contextualizeQPrompt = ChatPromptTemplate.fromMessages([
  ["system", contextualizeQSystemPrompt],
  new MessagesPlaceholder("chat_history"),
  ["human", "{input}"],
]);
const historyAwareRetriever = await createHistoryAwareRetriever({
  llm,
  retriever,
  rephrasePrompt: contextualizeQPrompt,
});

// Answer question
const qaSystemPrompt = `
You are an assistant for question-answering tasks. Use
the following pieces of retrieved context to answer the
question. If you don't know the answer, just say that you
don't know. Use three sentences maximum and keep the answer
concise.
\n\n
{context}`;
const qaPrompt = ChatPromptTemplate.fromMessages([
  ["system", qaSystemPrompt],
  new MessagesPlaceholder("chat_history"),
  ["human", "{input}"],
]);
const questionAnswerChain = await createStuffDocumentsChain({
  llm,
  prompt: qaPrompt,
});
const ragChain = await createRetrievalChain({
  retriever: historyAwareRetriever,
  combineDocsChain: questionAnswerChain,
});

// Usage:
const chat_history: BaseMessage[] = [];
const response = await ragChain.invoke({
  chat_history,
  input: "...",
});
```

Make sure that your embedding function's dimensions align with the existing index configuration to avoid this error [1][2].
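For the chain to actually be history-aware, `chat_history` must grow between calls. With `@langchain/core` installed you would push `HumanMessage` / `AIMessage` instances; the sketch below uses plain `{ role, content }` objects only to illustrate the bookkeeping, and the `recordTurn` helper is hypothetical:

```typescript
// Illustrative chat-history bookkeeping with plain objects.
// In a real chain you would push HumanMessage / AIMessage instances instead.
type Turn = { role: "human" | "ai"; content: string };

function recordTurn(history: Turn[], question: string, answer: string): Turn[] {
  // Return a new array so earlier snapshots of the history stay intact.
  return [
    ...history,
    { role: "human", content: question },
    { role: "ai", content: answer },
  ];
}

let history: Turn[] = [];
history = recordTurn(history, "What is the FIR about?", "It concerns a 2016 complaint.");
history = recordTurn(history, "Who lodged it?", "The prosecutrix.");
// history now holds four messages, alternating human/ai.
```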
Description
I am trying to create a history-aware conversational chain, but instead of using the LangChain Pinecone vector store to embed the data, I used my own embedding function.
The differences between the vectors are:
Pinecone vector store record:

```
ID:      a1f6f8f6-13b0-4af6-b7f8-4ca98e25677f
VALUES:  -0.108577356, -0.0476923771, 0.0842147246, -0.339654893, 0.062151365, -0.0347989909, -0.0412855819, -0.119942054, ...
METADATA:
  blobType: "application/pdf"
  loc.lines.from: 26
  loc.lines.to: 34
  loc.pageNumber: 1
  pdf.info.CreationDate: "D:20240602113734+05'30'"
  pdf.info.Creator: "wkhtmltopdf 0.12.6"
  pdf.info.IsAcroFormPresent: false
  pdf.info.IsXFAPresent: false
  pdf.info.PDFFormatVersion: "1.4"
  pdf.info.Producer: "Qt 4.8.7"
  pdf.info.Title: "ONE"
  pdf.totalPages: 4
  pdf.version: "1.10.100"
  source: "blob"
  text: "3.1 On 4th December 2016, the prosecutrix lodged a First Information Report (for short,\n"FIR") before the ...
```
My vectors:

```
ID:      batch2vector24
VALUES:  -0.196326926, -0.0750241131, 0.199734464, -0.167314678, 0.110599361, 0.248793736, 0.0464999564, -0.0746621042, ...
METADATA:
  text: "personal liberty not under the Act or rules and orders made thereunder but in contravention\nt ...
```
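LangChain's `PineconeStore` reads the document text back out of a metadata key (the `textKey` option, `"text"` by default), so self-embedded records need at least that field to be retrievable as documents. A hypothetical record builder (the `makeRecord` helper and `VectorRecord` type are illustrations, not LangChain APIs):

```typescript
// Hypothetical helper that shapes a self-embedded vector like the records
// LangChain's PineconeStore expects: the raw text lives under the metadata
// key the store is configured with ("text" by default).
type VectorRecord = {
  id: string;
  values: number[];
  metadata: Record<string, string | number | boolean>;
};

function makeRecord(
  id: string,
  values: number[],
  text: string,
  extra: Record<string, string | number | boolean> = {}
): VectorRecord {
  // Spread extra first so the text field always wins.
  return { id, values, metadata: { ...extra, text } };
}

const record = makeRecord(
  "batch2vector24",
  [-0.196326926, -0.0750241131],
  "personal liberty not under the Act",
  { source: "blob" }
);
```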
The error I was getting:

```
[RequiredError: Required parameter requestParameters.indexName was null or undefined when calling describeIndex.] {
  field: 'indexName',
  name: 'RequiredError'
}
```
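This `RequiredError` means `describeIndex` received an undefined index name, usually because a config value or environment variable never made it into the store setup. A fail-fast lookup surfaces the real cause earlier (the env variable name `PINECONE_INDEX` and the helper are assumptions for this sketch):

```typescript
// Fail fast when the Pinecone index name is missing, instead of letting
// describeIndex throw a RequiredError deep inside the client.
// The env variable name PINECONE_INDEX is an assumption for this sketch.
function getIndexName(env: Record<string, string | undefined>): string {
  const name = env.PINECONE_INDEX;
  if (!name) {
    throw new Error(
      "Pinecone index name is not set; define PINECONE_INDEX before building the vector store."
    );
  }
  return name;
}

// Usage: const indexName = getIndexName(process.env);
```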
System Info

```
langchain@0.2.9 | MIT | deps: 16 | versions: 281
Typescript bindings for langchain
https://github.com/langchain-ai/langchainjs/tree/main/langchain/

keywords: llm, ai, gpt3, chain, prompt, prompt engineering, chatgpt, machine learning, ml, openai, embeddings, vectorstores

dist
.tarball: https://registry.npmjs.org/langchain/-/langchain-0.2.9.tgz
.shasum: 1341bdd7166f4f6da0b9337f363e409a79523dbb
.integrity: sha512-iZ0l7BDVfoifqZlDl1gy3JP5mIdhYjWiToPlDnlmfHD748cw3okvF0gZo0ruT4nbftnQcaM7JzPUiNC43UPfgg==
.unpackedSize: 4.0 MB

dependencies:
@langchain/core: >=0.2.11 <0.3.0    jsonpointer: ^5.0.1       uuid: ^10.0.0
@langchain/openai: >=0.1.0 <0.3.0   langchainhub: ~0.0.8      yaml: ^2.2.1
@langchain/textsplitters: ~0.0.0    langsmith: ~0.1.30        zod-to-json-schema: ^3.22.3
binary-extensions: ^2.2.0           ml-distance: ^4.0.0       zod: ^3.22.4
js-tiktoken: ^1.0.12                openapi-types: ^12.1.3
js-yaml: ^4.1.0                     p-retry: 4

dist-tags:
latest: 0.2.9        next: 0.2.3-rc.0

published yesterday by jacoblee93
```