Example Code

```python
import pydantic

class AnswerWithJustification(pydantic.v1.BaseModel):
    """An answer to the user question along with justification for the answer."""

    answer: str
    justification: str

sample_prompt = "What weighs more a pound of bricks or a pound of feathers"

...
# you fill in this part
...

azure_openai_result = azure_openai_configured_structured_llm.invoke(sample_prompt)
print(f"{azure_openai_result=}")
bedrock_claude_result = bedrock_claude_configured_structured_llm.invoke(sample_prompt)
print(f"{bedrock_claude_result=}")
```

Description

I'd like an example of how, using the new langchain.chat_models.base.init_chat_model function, I can make a chain that has an LLM configured to use structured output and which lets me configure at runtime whether to use Azure OpenAI or Bedrock.
To switch between the two providers, you can initialize each model separately with `init_chat_model` and attach the structured-output schema:

```python
from langchain.chat_models import init_chat_model
import pydantic

class AnswerWithJustification(pydantic.v1.BaseModel):
    """An answer to the user question along with justification for the answer."""

    answer: str
    justification: str

sample_prompt = "What weighs more a pound of bricks or a pound of feathers"

# Initialize an Azure OpenAI model and bind the structured-output schema
azure_openai_configured_structured_llm = init_chat_model(
    "gpt-4o",
    model_provider="azure_openai",
    temperature=0.7,
    azure_deployment="your_deployment_name",  # Replace with your actual deployment name
    azure_endpoint="https://your-endpoint.azure.com/",  # Replace with your actual endpoint
).with_structured_output(AnswerWithJustification)

# Initialize a Bedrock model and bind the same schema
bedrock_claude_configured_structured_llm = init_chat_model(
    "amazon.some-model",
    model_provider="bedrock",
    temperature=0.7,
).with_structured_output(AnswerWithJustification)

# Use the models
azure_openai_result = azure_openai_configured_structured_llm.invoke(sample_prompt)
print(f"{azure_openai_result=}")
bedrock_claude_result = bedrock_claude_configured_structured_llm.invoke(sample_prompt)
print(f"{bedrock_claude_result=}")
```

In this example, each model is built with its own provider-specific parameters. This setup allows you to dynamically choose between the two providers, though the choice here is made in code rather than purely through configuration.
What I'm after is something that can switch between azure and bedrock structured output LLMs using nothing but configuration. I figured out how to do that on my own after some experimentation. Here's a passing pytest for me.
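That configuration-only switching can be sketched with `init_chat_model`'s `configurable_fields` and `config_prefix` parameters, which defer the provider choice to the run config passed at invoke time. This is a hedged sketch, not the author's actual pytest: the model IDs, deployment values, and config keys below are placeholders, and it assumes a langchain version whose `init_chat_model` supports configurable fields and whose configurable model forwards `with_structured_output`.

```python
import pydantic
from langchain.chat_models import init_chat_model

class AnswerWithJustification(pydantic.v1.BaseModel):
    """An answer to the user question along with justification for the answer."""

    answer: str
    justification: str

# Declare a model whose provider and parameters are decided at invoke time.
# No provider is contacted here; the choice is deferred to the run config.
configurable_llm = init_chat_model(
    configurable_fields=("model", "model_provider", "temperature"),
    config_prefix="llm",
)
structured_llm = configurable_llm.with_structured_output(AnswerWithJustification)

sample_prompt = "What weighs more a pound of bricks or a pound of feathers"

# Route to Azure OpenAI purely via configuration...
azure_result = structured_llm.invoke(
    sample_prompt,
    config={
        "configurable": {
            "llm_model": "gpt-4o",  # placeholder model name
            "llm_model_provider": "azure_openai",
        }
    },
)

# ...or to Bedrock, with no code changes, only a different config.
bedrock_result = structured_llm.invoke(
    sample_prompt,
    config={
        "configurable": {
            "llm_model": "amazon.some-model",  # placeholder model name
            "llm_model_provider": "bedrock",
        }
    },
)
```

Because both invocations go through the same `structured_llm` object, a pytest can parametrize over the two config dicts and assert that each call returns an `AnswerWithJustification` instance, which matches the "nothing but configuration" goal stated above. Running it requires valid Azure and Bedrock credentials in the environment, so it is not asserted here.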