Replies: 1 comment
-
🤖 Hello,

The error message you're seeing indicates that the batch method received a keyword argument it does not accept. Its signature is:

def batch(
    self,
    inputs: List[Input],
    config: Optional[Union[RunnableConfig, List[RunnableConfig]]] = None,
    **kwargs: Optional[Any],
) -> List[Output]:

It seems like the error might be coming from the way you're trying to bind the functions to the model. Here's an example of how you might do this:

from langchain.chat_models import ChatOpenAI
from langchain.output_parsers.openai_functions import JsonKeyOutputFunctionsParser
from langchain.prompts import ChatPromptTemplate
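For intuition about that signature: batch fans each input through the chain, optionally capping how many run at once via max_concurrency. The following is a minimal sketch of that pattern in plain Python, not LangChain's actual implementation; the batch function and its inputs here are hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor
from typing import Any, Callable, List, Optional


def batch(
    func: Callable[[Any], Any],
    inputs: List[Any],
    max_concurrency: Optional[int] = None,
) -> List[Any]:
    # Run func over every input, preserving input order, with an
    # optional cap on how many calls execute concurrently.
    with ThreadPoolExecutor(max_workers=max_concurrency or len(inputs)) as pool:
        return list(pool.map(func, inputs))


results = batch(lambda x: x * 2, [1, 2, 3], max_concurrency=2)
```

Note that extra keyword arguments (like functions) have no place in this call: they belong on the model, not on batch.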
functions = [
{
"name": "hypothetical_questions",
"description": "Generate hypothetical questions",
"parameters": {
"type": "object",
"properties": {
"questions": {
"type": "array",
"items": {
"type": "string"
},
},
},
"required": ["questions"]
}
}
]
chat_openai = ChatOpenAI(max_retries=0, model="gpt-3.5-turbo")
# bind() returns a new runnable; capture the result, and attach the
# function_call directive here instead of passing it to batch()
model_with_functions = chat_openai.bind(
    functions=functions,
    function_call={"name": "hypothetical_questions"},
)
chain = (
    {"doc": lambda x: x.page_content}
    | ChatPromptTemplate.from_template("Generate a list of 3 hypothetical questions that the below document could be used to answer:\n\n{doc}")
    | model_with_functions
    | JsonKeyOutputFunctionsParser(key_name="questions")
)
hypothetical_questions = chain.batch(docs, {"max_concurrency": 5})

In this example, I've created an instance of ChatOpenAI, bound the functions (and the function_call directive) to it with bind(), and used the bound model inside the chain instead of passing functions to batch(). Please try this and let me know if it resolves your issue. If not, could you please provide more details about the context in which the error is occurring? Specifically, it would be helpful to know the exact line of code that's causing the error.

Best,
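The key detail in the example above is that bind() does not mutate the model in place; it returns a new runnable carrying the extra kwargs, so the return value must be captured. A minimal sketch of that pattern, using a hypothetical Runnable class rather than LangChain's:

```python
class Runnable:
    """Toy stand-in for a bindable runnable (hypothetical, not LangChain's)."""

    def __init__(self, bound_kwargs=None):
        self.bound_kwargs = dict(bound_kwargs or {})

    def bind(self, **kwargs):
        # Return a NEW runnable with the merged kwargs;
        # self is deliberately left untouched.
        return Runnable({**self.bound_kwargs, **kwargs})


model = Runnable()
model.bind(functions=["f"])          # result discarded: model is unchanged
bound = model.bind(functions=["f"])  # correct: capture the returned runnable
```

This is why a bare chat_openai.bind(functions=functions) on its own line has no effect on later uses of chat_openai.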
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot. |
-
TypeError Traceback (most recent call last)
Cell In[114], line 1
----> 1 hypothetical_questions = chain.batch(docs)
3 # The vectorstore to use to index the child chunks
4 vectorstore = Chroma(
5 collection_name="hypo-questions",
6 embedding_function=OpenAIEmbeddings()
7 )
File, in batch(self, inputs, config, max_concurrency)
848 try:
849 for step in self.steps:
850 inputs = step.batch(
--> 851 inputs,
852 [
853 # each step a child run of the corresponding root run
854 patch_config(config, callbacks=rm.get_child())
855 for rm, config in zip(run_managers, configs)
856 ],
857 )
858 # finish the root runs
859 except (KeyboardInterrupt, Exception) as e:
File, in batch(self, inputs, config, max_concurrency)
...
1278 ]
1280 # Wrap in a coroutine to satisfy linter
1281 async def get_next_chunk(generator: AsyncIterator) -> Optional[Output]:
TypeError: Runnable.batch() got an unexpected keyword argument 'functions'
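For reference, the TypeError at the bottom of the traceback is ordinary Python behavior: when a method's signature does not declare a keyword parameter (and has no **kwargs to absorb it), the call itself raises. A minimal sketch with a hypothetical batch function, not LangChain's:

```python
def batch(inputs, config=None):
    # Only 'inputs' and 'config' are accepted; any other keyword
    # argument makes the call raise TypeError before the body runs.
    return inputs


try:
    batch([1, 2], functions=[{"name": "hypothetical_questions"}])
except TypeError as exc:
    message = str(exc)
```

This is why moving functions out of the batch() call and onto the model via bind() resolves the error.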