How Can We Chain PromptTemplate with PromptTemplate #21481
DragonMengLong announced in Ideas
Replies: 1 comment
-
You can just pass a function through RunnableLambda that converts the PromptValue from the first element into a string before passing it to the second element. You can find a very similar example in the note I published: https://medium.com/p/925aa077227e. Example 1 covers the same problem statement: merging the output of one prompt template into another.
-
Feature request
I am wondering whether we can chain PromptTemplates together using "|", just like how we chain a PromptTemplate and an LLM, for cases where one PromptTemplate needs to wrap another. This is different from prompt composition or partial prompt templates.
Like this: "prompt_template | prompt_template | prompt_template | llm"
Motivation
I think it would be more convenient if we could simply chain two PromptTemplates together, especially when I want to change all my previous PromptTemplate code; it would make the code easier to maintain. 🙂
Proposal
I find we can't directly chain two PromptTemplates together like "prompt_template | prompt_template", because the first one returns a PromptValue while the second one takes a str. We also can't simply write "{input.text}" in the template when instantiating the second PromptTemplate, because then the input variable becomes "input.text" rather than "input".
So I wrote a PromptGlue class inheriting from Runnable and implemented the invoke method. Now we can chain two PromptTemplates together like this: "fewshot_prompt_template | prompt_glue | prompt_template | llm".
Example Code