Conversation

@sanjeed5 (Contributor) commented on Nov 14, 2025

Issue Link / Problem Description

Updates the customize_models.md guide to use the modern llm_factory and embedding_factory APIs instead of the legacy LangChain wrapper pattern.

Changes Made

  • Updated the Azure OpenAI example to use llm_factory with the litellm provider (see the sketch after this list)
  • Updated the Google Vertex AI example to use llm_factory with the litellm provider (a major simplification)
  • Updated the AWS Bedrock example to use llm_factory with the litellm provider
  • Fixed Azure model naming to use deployment names
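
To make the new pattern concrete, here is a rough sketch of the Azure OpenAI case. It is illustrative only: the `azure/{deployment-name}` model string and the `AZURE_*` environment variables follow LiteLLM's conventions, while the llm_factory keyword arguments (e.g. `provider="litellm"`) are inferred from this PR's description rather than copied verbatim from the updated guide.

```python
import os

from ragas.llms import llm_factory

# Azure credentials are read by LiteLLM from these environment variables
# (variable names per LiteLLM's Azure OpenAI docs).
os.environ["AZURE_API_KEY"] = "your-api-key"
os.environ["AZURE_API_BASE"] = "https://your-resource.openai.azure.com/"
os.environ["AZURE_API_VERSION"] = "2024-02-15-preview"

# LiteLLM expects the *deployment* name in the model string, hence the
# azure/{deployment-name} form. The provider keyword below is assumed from
# this PR's description; see the updated guide for the exact signature.
azure_llm = llm_factory("azure/your-deployment-name", provider="litellm")
```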

Testing

How to Test

  • Manual verification:
    1. Azure OpenAI, Google Vertex AI, AWS Bedrock: syntax validated against the LiteLLM and Instructor documentation
    2. No live API calls were made against the cloud providers (no environment setup)
    3. Code patterns verified against the official docs

References

Replace legacy LangChain wrapper pattern with llm_factory and
embedding_factory using LiteLLM provider for all cloud providers.

Changes:
- Updated the Azure OpenAI, Google Vertex AI, and AWS Bedrock examples
- Removed LangChain dependencies (langchain_openai, langchain_google_vertexai,
  langchain_aws) in favor of litellm
- Fixed Azure model naming to use deployment names (azure/{deployment-name})
- Removed the unnecessary interface="modern" parameter (it is auto-detected)

Benefits:
- Consistent pattern across all cloud providers
- Simpler code with fewer dependencies
- No LangChain required
- Uses modern Ragas factory APIs with Instructor and LiteLLM

All syntax was verified against the LiteLLM and Instructor documentation; a rough sketch of the resulting pattern is shown below.
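
The following sketch shows how the same pattern extends to the other providers and to embeddings. The model identifiers are standard LiteLLM strings; the factory keyword arguments are again an assumption based on this PR's description, not an excerpt from the guide, and credentials are expected to come from the usual provider environment variables or IAM roles.

```python
from ragas.embeddings import embedding_factory
from ragas.llms import llm_factory

# Google Vertex AI and AWS Bedrock via LiteLLM-style model strings;
# authentication is handled by the normal GCP/AWS credential chain.
vertex_llm = llm_factory("vertex_ai/gemini-1.5-pro", provider="litellm")
bedrock_llm = llm_factory(
    "bedrock/anthropic.claude-3-sonnet-20240229-v1:0", provider="litellm"
)

# Embeddings follow the same factory pattern; for Azure, pass the
# embedding *deployment* name rather than the model name.
azure_embeddings = embedding_factory(
    "azure/your-embedding-deployment", provider="litellm"
)
```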
@dosubot dosubot bot added the size:L This PR changes 100-499 lines, ignoring generated files. label Nov 14, 2025