Addresses the "no embeddings found" and "API Connection error" issues.
Specifically, issues
[1546](#1546),
[1526](#1526),
[1512](#1512),
[1496](#1496)
Users have reported that they cannot generate a testset because they get
API connection errors, or because their knowledge graph is missing
embeddings. This is caused by the use of the default LLM and embedding
models via `llm_factory` and `embedding_factory`: the errors occur
because users who rely on different models in their workflow do not have
OpenAI credentials in their environment.
The fix is to prevent the `default_transforms` function from falling
back to `llm_factory`, by requiring the user to supply both an embedding
model and an LLM when instantiating `TestsetGenerator`.
1. Added `embedding_model` as an attribute to `TestsetGenerator`.
2. Added `embedding_model: LangchainEmbeddings` as a parameter to
   `TestsetGenerator.from_langchain`.
3. Changed the return of `TestsetGenerator.from_langchain` to
   `return cls(LangchainLLMWrapper(llm),
   LangchainEmbeddingsWrapper(embedding_model), knowledge_graph)`.
4. Added both `llm` and `embedding_model` parameters to
   `TestsetGenerator.generate_with_langchain_docs`.
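The changes above can be sketched end to end. This is an illustrative stand-in, not the real ragas/langchain code: the wrapper and generator classes below are simplified stubs that mirror the names mentioned in the change list, and the `knowledge_graph` default is an assumption.

```python
class LangchainLLMWrapper:
    # Stand-in for the wrapper ragas uses to adapt a LangChain LLM.
    def __init__(self, llm):
        self.llm = llm


class LangchainEmbeddingsWrapper:
    # Stand-in for the wrapper ragas uses to adapt LangChain embeddings.
    def __init__(self, embeddings):
        self.embeddings = embeddings


class TestsetGeneratorSketch:
    def __init__(self, llm, embedding_model, knowledge_graph=None):
        # Change 1: embedding_model is now a generator attribute.
        self.llm = llm
        self.embedding_model = embedding_model
        self.knowledge_graph = knowledge_graph

    @classmethod
    def from_langchain(cls, llm, embedding_model, knowledge_graph=None):
        # Changes 2 and 3: the caller must pass an embedding model, and
        # both models are wrapped before the generator is constructed.
        return cls(
            LangchainLLMWrapper(llm),
            LangchainEmbeddingsWrapper(embedding_model),
            knowledge_graph,
        )
```

With this shape, a user who never configures OpenAI simply passes their own LangChain LLM and embeddings to `from_langchain`, and no factory default is ever consulted.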