refactor: simplify langchain/llm configs #150

@navalnica

Description

StatGPT Backend version

latest

What is the problem this feature will solve?

there is a separation-of-concerns issue between the langchain settings and the llm/embedding model settings:

  • langchain settings allow setting default llm/embedding params (e.g. temperature) via envvars
  • this adds little value: we likely don't need to control such params via envvars, since we've moved to comprehensive llm model configs
  • however, it makes the settings complex and unclear; it's much clearer to control these params via the llm/embedding model configs
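The overlap can be sketched as follows. This is a minimal illustration, not the actual StatGPT code: the class and field names mirror this issue, while the envvar name and defaults are hypothetical.

```python
# Sketch of the separation-of-concerns issue: the same knob exists in two
# places. Field names follow the issue; the envvar name is hypothetical.
from __future__ import annotations

import os
from dataclasses import dataclass, field


def _env_float(name: str, default: float) -> float:
    # read a float default from an envvar, falling back when unset
    raw = os.environ.get(name)
    return float(raw) if raw is not None else default


@dataclass
class LangChainSettings:
    # default llm params sourced from envvars, the coupling in question
    default_temperature: float = field(
        default_factory=lambda: _env_float("LANGCHAIN_DEFAULT_TEMPERATURE", 0.0)
    )
    default_seed: int | None = None


@dataclass
class LLMModelConfig:
    # the per-model config can carry the same params,
    # leaving two sources of truth for one value
    deployment: str
    temperature: float = 0.0
    seed: int | None = None
```

Any value set in `LangChainSettings` can silently shadow or contradict what the model config says, which is the complexity the issue wants to remove.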

What is the proposed feature or solution?

  • remove the following LangChainSettings fields:
    • default_model (? could be a large refactoring and might not be worth it, but I think it's a good idea to explicitly specify the LLM model to use)
    • embedding_default_model (??? an even larger refactoring is needed; probably keep it as is)
    • default_temperature
    • default_seed
  • remove the corresponding envvars and update the envvars markdown listing
  • set api_version in the langchain config and remove it from the llm/embeddings model config
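The end state the bullets describe could look like this. A sketch only: field names follow the issue, and the api_version value is just an example, not prescriptive.

```python
# Sketch of the settings after the refactor: envvar-driven llm defaults
# are gone; api_version lives in the langchain config instead of the
# llm/embeddings model config. Values shown are examples.
from __future__ import annotations

from dataclasses import dataclass


@dataclass
class LangChainSettings:
    # only provider-level plumbing remains here
    api_version: str = "2024-06-01"  # example value, moved in from model config


@dataclass
class LLMModelConfig:
    # temperature and seed now live only in the model config,
    # with no envvar-driven defaults shadowing them
    deployment: str
    temperature: float = 0.0
    seed: int | None = None
```

With this shape there is exactly one place to look up (and one place to set) each generation parameter.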

after removing the envvars we'll need to update:

  • envvars readme files
  • helms
  • .env.template
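Once the envvars are removed, a quick sweep can confirm that nothing in the readmes, helm charts, or .env.template still references them. A rough helper for that, with hypothetical envvar names in the pattern:

```python
# Scan a tree for stale references to the removed envvars.
# The names in PATTERN are hypothetical; substitute the real ones.
from __future__ import annotations

import pathlib
import re

PATTERN = re.compile(r"LANGCHAIN_DEFAULT_(TEMPERATURE|SEED)")


def find_stale_refs(root: str) -> list[tuple[str, int, str]]:
    """Return (path, line number, line text) for every match under root."""
    hits: list[tuple[str, int, str]] = []
    for path in pathlib.Path(root).rglob("*"):
        # only scan docs, helm/yaml files, and the env template
        if path.suffix not in {".md", ".yaml", ".yml"} and path.name != ".env.template":
            continue
        try:
            text = path.read_text(encoding="utf-8")
        except (OSError, UnicodeDecodeError):
            continue
        for lineno, line in enumerate(text.splitlines(), start=1):
            if PATTERN.search(line):
                hits.append((str(path), lineno, line.strip()))
    return hits
```

An empty result after the cleanup means the readmes, helms, and .env.template are consistent with the new settings.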

What alternatives have you considered?

keep as is

Metadata

Labels: enhancement (New feature or request)
