What I'm after is something that can switch between Azure and Bedrock structured-output LLMs using nothing but configuration. I figured out how to do that on my own after some experimentation. Here's a pytest that passes for me.

def test_init_chat_model_basic_example():
    from langchain.chat_models import init_chat_model
    import pydantic

    class AnswerWithJustification(pydantic.v1.BaseModel):
        """An answer to the user question along with justification for the answer."""

        answer: str
        justification: str

    sample_prompt = "What weighs more, a pound of bricks or a pound of feathers?"

    azure_openai_gpt_35_turbo_deployment_name = "<YOUR DEPLOYMENT NAME HERE>"
    b…
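The test above is truncated, but the configuration-driven part can be sketched as a small helper that maps a flat config dict to the kwargs for `init_chat_model`. This is a hypothetical helper, not from the original test: the provider strings (`"azure_openai"`, `"bedrock"`) match `init_chat_model`'s documented values, but the config keys (`deployment_name`, `model_id`, `region`, `api_version`) and defaults here are assumptions you should check against your langchain version.

```python
def init_chat_model_kwargs(config: dict) -> dict:
    """Hypothetical helper: translate a flat provider config into kwargs for
    langchain's init_chat_model, so Azure vs. Bedrock is chosen by config alone."""
    provider = config["provider"]
    if provider == "azure_openai":
        # Azure OpenAI selects the model by deployment name rather than model id.
        return {
            "model": config["deployment_name"],
            "model_provider": "azure_openai",
            "api_version": config.get("api_version", "2024-02-01"),
        }
    if provider == "bedrock":
        return {
            "model": config["model_id"],
            "model_provider": "bedrock",
            "region_name": config.get("region", "us-east-1"),
        }
    raise ValueError(f"Unsupported provider: {provider}")
```

With that in place, the structured-output model would be built in one line per provider, e.g. `init_chat_model(**init_chat_model_kwargs(cfg)).with_structured_output(AnswerWithJustification)`, and the test body stays identical for both providers.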

Answer selected by codekiln