Make LLM directly use the response of a GET API from the API Loader without the need of a Vector Store #3277
fabiopipitone asked this question in Q&A
Hi there,
I'd like to extract some documents from my vector store (let's say Elasticsearch) via custom API (so I can build the body of the query) and use the response to directly instruct the LLM prompt.
Let's say I have an ES index "myrecipes" with docs containing some recipes and structured as follows:
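For illustration, assume each recipe document looks something like this (the field names are hypothetical, just to make the example concrete):

```json
{
  "title": "Spaghetti alla carbonara",
  "ingredients": ["spaghetti", "eggs", "guanciale", "pecorino"],
  "instructions": "Whisk the eggs with the cheese, fry the guanciale, toss with the pasta."
}
```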
Let's say I query the ES index via the API Loader and extract the top 3 recipes. Then I want to pass them to a Prompt Template (or Chain Prompt Template) that instructs the LLM to act as a food critic and write a review of each of the retrieved recipes.
How can I use the documents retrieved by the API Loader in the Prompt Template? The output of the API Loader is of type Document, and it seems to only match the Document Store input.
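Conceptually, what I'm after is flattening the Document objects into a plain context string and injecting it into the prompt. A minimal sketch of that step, assuming LangChain-style documents exposed as dicts with a `pageContent` field (the field name and the template wording are my assumptions, not Flowise specifics):

```python
# Sketch: turn API Loader output (Document-like objects) into a
# context string that a prompt template can consume directly.

def documents_to_context(docs):
    """Join the text of each retrieved document into one numbered context block."""
    return "\n\n".join(
        f"Recipe {i + 1}:\n{d['pageContent']}" for i, d in enumerate(docs)
    )

def build_prompt(docs):
    """Embed the retrieved recipes into a food-critic instruction."""
    context = documents_to_context(docs)
    return (
        "You are a food critic. Write a short review of each of the "
        "following recipes:\n\n" + context
    )

# Stand-ins for the top 3 documents returned by the API Loader.
docs = [
    {"pageContent": "Spaghetti alla carbonara: eggs, guanciale, pecorino."},
    {"pageContent": "Risotto ai funghi: rice, porcini mushrooms, parmesan."},
]
print(build_prompt(docs))
```

In Flowise terms this is roughly what a custom function node sitting between the loader and the prompt would have to do.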
Here's where I'm stuck.

Any ideas?
Thank you 🙏
P.S. What I'd actually need is to let the user enter an ingredient, make a GET request to Elasticsearch using that ingredient as the query string, take the top 3 results, and then proceed as described above, but let's take one step at a time :)
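For that follow-up step, the search request itself is straightforward to build. A sketch of the URL path and body the API Loader would need, where the index name `myrecipes` comes from the example above but the `ingredients` field name is an assumption about the mapping:

```python
# Sketch: build the Elasticsearch search request for a user-supplied
# ingredient, limited to the top 3 hits via the "size" parameter.

def build_search_request(ingredient, index="myrecipes", top_k=3):
    """Return the URL path and JSON body for an ES match query."""
    path = f"/{index}/_search"
    body = {
        "size": top_k,  # only the top 3 recipes
        "query": {"match": {"ingredients": ingredient}},
    }
    return path, body

path, body = build_search_request("mushrooms")
print(path, body)
```

The same body works with a GET or POST `_search` call; the API Loader would just need the ingredient templated into it.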