Problems with with_structured_output with CustomChatModel #26942
santiagovasquez1 asked this question in Q&A (Unanswered)
Example Code
Description
I am trying to convert the output of the LLM invocation to the IntentModel type, but when executing the invocation I get the error 'NoneType' object has no attribute 'with_structured_output'.
What should I do to execute the method correctly?
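For context, a minimal sketch of the situation the error message describes (this is not the example code from the post, which is not reproduced above). In Python, 'NoneType' object has no attribute 'with_structured_output' means the variable holding the chat model is None at the time of the call, rather than with_structured_output being missing from the custom model class; one common way this happens is a factory or setup function that never returns the model it builds. Everything below except the IntentModel name is a hypothetical illustration.

```python
# Hypothetical illustration of the failure mode; only IntentModel is taken
# from the question, everything else is assumed.
from typing import Optional

from langchain_core.language_models import BaseChatModel
from pydantic import BaseModel, Field


class IntentModel(BaseModel):
    """Schema passed to with_structured_output (name taken from the description)."""

    intent: str = Field(description="Detected user intent")


def build_llm() -> Optional[BaseChatModel]:
    """Stand-in for whatever code constructs the custom chat model."""
    llm = None  # e.g. CustomChatModel(...) was never assigned, or an error path returned None
    return llm


llm = build_llm()

# Guard that surfaces the real problem instead of the later AttributeError:
if llm is None:
    raise RuntimeError("The chat model was never constructed; check the code that builds it")

# With a non-None BaseChatModel instance, this is the usual call:
structured_llm = llm.with_structured_output(IntentModel)
result = structured_llm.invoke("I want to cancel my subscription")
```

Checking where the model instance is created (and that it is actually returned and assigned) is usually the first step before looking at the with_structured_output implementation itself.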
System Info
OS: Linux
OS Version: #1 SMP Fri Sep 25 19:48:47 UTC 2020
Python Version: 3.12.4 | packaged by conda-forge | (main, Jun 17 2024, 10:23:07) [GCC 12.3.0]
langchain_core: 0.3.5
langchain: 0.3.0
langchain_community: 0.3.0
langsmith: 0.1.128
langchain_huggingface: 0.1.0
langchain_text_splitters: 0.3.0