
LLM Chat Component

A LoopBack 4 based component that integrates a basic LangGraph.js based endpoint into your application, which can use any tool you register through the provided decorator.

Installation

Install AiIntegrationsComponent using npm or yarn:

$ [npm install | yarn add] lb4-llm-chat-component

Basic Usage

Configure and load AiIntegrationsComponent in the application constructor as shown below.

import {AiIntegrationsComponent} from 'lb4-llm-chat-component';
// ...
export class MyApplication extends BootMixin(
  ServiceMixin(RepositoryMixin(RestApplication)),
) {
  constructor(options: ApplicationConfig = {}) {
    super(options);
    // could be any LLM provider or your own LangGraph supported LLM provider
    // you can also use a different LLM for each type - cheap, smart and multimodal
    this.bind(AiIntegrationBindings.CheapLLM).toProvider(Ollama);
    this.bind(AiIntegrationBindings.SmartLLM).toProvider(Ollama);
    this.bind(AiIntegrationBindings.FileLLM).toProvider(Ollama);
    // configuration
    this.bind(AiIntegrationBindings.Config).to({
      // if not set to true, it will bind an ARC based sequence from @sourceloop/core with authentication and authorization
      useCustomSequence: true,
      // if not set to false, it will bind the core component from @sourceloop/core by default
      mountCore: false,
      // if not set to false, it will bind the @sourceloop/file-utils component with its default config
      mountFileUtils: false,
    });
    this.component(AiIntegrationsComponent);

    // ...
  }
  // ...
}

LLM Providers

Ollama

To use the Ollama based models, install the package - @langchain/ollama and update your application.ts -

this.bind(AiIntegrationBindings.CheapLLM).toProvider(Ollama);
this.bind(AiIntegrationBindings.SmartLLM).toProvider(Ollama);
this.bind(AiIntegrationBindings.FileLLM).toProvider(Ollama);

Gemini

To use the Gemini based models, install the packages - @google/generative-ai and @langchain/google-genai - and update your application.ts -

this.bind(AiIntegrationBindings.CheapLLM).toProvider(Gemini);
this.bind(AiIntegrationBindings.SmartLLM).toProvider(Gemini);
this.bind(AiIntegrationBindings.FileLLM).toProvider(Gemini);

Cerebras

To use the Cerebras based models, install the package - @langchain/cerebras and update your application.ts -

this.bind(AiIntegrationBindings.CheapLLM).toProvider(Cerebras);
this.bind(AiIntegrationBindings.SmartLLM).toProvider(Cerebras);
this.bind(AiIntegrationBindings.FileLLM).toProvider(Cerebras);

Anthropic

To use the Anthropic based models, install the package - @langchain/anthropic and update your application.ts -

this.bind(AiIntegrationBindings.CheapLLM).toProvider(Anthropic);
this.bind(AiIntegrationBindings.SmartLLM).toProvider(Anthropic);
this.bind(AiIntegrationBindings.FileLLM).toProvider(Anthropic);

Bedrock

To use the Bedrock based models, install the package - @langchain/aws and update your application.ts -

this.bind(AiIntegrationBindings.CheapLLM).toProvider(Bedrock);
this.bind(AiIntegrationBindings.SmartLLM).toProvider(Bedrock);
this.bind(AiIntegrationBindings.FileLLM).toProvider(Bedrock);

This binding adds a /generate endpoint to your service that can answer the user's query using the registered tools. By default, the module provides one set of tools through the DbQueryComponent.
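
A minimal sketch of calling the endpoint, assuming it accepts a JSON body with a prompt field; check the component's /generate controller for the actual request shape and any authentication requirements:

// hypothetical client call; the payload and response shapes here are assumptions
const response = await fetch('http://localhost:3000/generate', {
  method: 'POST',
  headers: {'Content-Type': 'application/json'},
  body: JSON.stringify({prompt: 'How many employees joined this year?'}),
});
console.log(await response.json());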

DbQueryComponent

This component provides a set of pre-built tools that can be plugged into any LoopBack 4 application -

  • generate-query - this tool can be used by the LLM to generate a database query based on the user's prompt. It returns a DataSet to the LLM instead of the query itself.
  • improve-query - this tool takes a DataSet's id and feedback or a suggestion from the user, and uses it to modify the existing DataSet's query.
  • ask-about-dataset - this tool takes a DataSet's id and a user prompt, and tries to answer the user's question about the database query. Note that it cannot run the query.

Query Generation Flow

---
title: Query Generation Flow
---
graph TD
    A[User Query Input] --> B[Query Generation Tool]
    B --> C[Datasets Store]
    C --> D[API Response]
    C --> DB
    B5 --> LLM

    subgraph "Query Generation Tool"
        B5[Context Compression]
        B1[Intent Analysis]
        B2[SQL Generation]
        B3[Semantic Validation]
        B4[Syntactic Validation]
        B6[Memory Retriever]
    end

    B6 --> Cache

    subgraph "External Systems"
        LLM[Model Connector]
        DB[Database]
        Cache[Vector Store]
    end


    B1 --> LLM
    B2 --> LLM
    B3 --> LLM
    B --> Cache


Providing Context

There are two ways to provide context to the LLM -

Global Context

Global context can be provided as an array of strings through a binding on key DbQueryAIExtensionBindings.GlobalContext. This binding can be a constant or come through a dynamic provider, something like this -

import {Provider} from '@loopback/core';
import {repository} from '@loopback/repository';

export class ChecksProvider implements Provider<string[]> {
  constructor(
    // repositories can be injected here to build dynamic context values
    @repository(CurrencyRepository)
    private readonly currencyRepository: CurrencyRepository,
  ) {}
  async value(): Promise<string[]> {
    return [`Current date is ${new Date().toISOString().split('T')[0]}`];
  }
}

in application.ts -

...
this.bind(DbQueryAIExtensionBindings.GlobalContext).toProvider(ChecksProvider);
...

Model Context

Each model can have associated context in 3 ways -

@model({
  name: 'employees', // Use plural form for table name
  settings: {
    description: 'Model representing an employee in the system.',
    context: [
      'employee salary must be converted to USD, using the currency_id column and the exchange rate table',
    ],
  },
})
export class Employee extends Entity {
  ...
  @property({
    type: 'string',
    required: true,
    description: 'Name of the employee',
  })
  name: string;

  @property({
    type: 'string',
    required: true,
    description: 'Unique code for the employee, used for identification',
  })
  code: string;

  @property({
    type: 'number',
    required: true,
    description:
      'The salary of the employee in the currency stored in currency_id column',
  })
  salary: number;
  ...
}
  • Model description - this is the primary description of the model; it is used to select models for generation, so it should only define the purpose of the model itself.
  • Model context - this is secondary information about the model, usually defining specific details that must be kept in mind while using it. NOTE - These values should always include the model name. This must be information applicable to the overall usage of the model, or at least to multiple columns, and not related to any single field of the model.
  • Property description - this is the description for a property of a model, providing context for the LLM on how to use and understand a particular property.

Usage

You just need to register your models in the component's configuration; if the models have proper, detailed descriptions, the tools should be able to answer the user's prompts based on them.

this.bind(DbQueryAIExtensionBindings.Config).to({
  models: [
    {
      model: Employee, // A normal LoopBack 4 model class with a proper description
      readPermissionKey: '1', // permission key used to check access for this particular model/table
    },
  ],
  db: {
    dialect: SupportedDBs.PostgreSQL, // dialect for which the SQL will be generated.
    schema: 'public', // schema of the database in case of DBs like PostgreSQL
  },
});
this.component(DbQueryComponent);

You also need to create a LoopBack 4 datasource with the name db. If you have an existing datasource that you want to use instead, you can provide its name like this -

this.bind(DatasetServiceBindings.Config).to({
  datasourceName: 'datasetdb',
});
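
For reference, a minimal datasource under the default name db could look like the sketch below; the postgresql connector and the DB_URL environment variable are assumptions for illustration:

import {juggler} from '@loopback/repository';

// a minimal sketch of a datasource named 'db'; the connector and
// connection settings are illustrative assumptions
export class DbDataSource extends juggler.DataSource {
  static dataSourceName = 'db';

  constructor() {
    super({
      name: 'db',
      connector: 'postgresql',
      url: process.env.DB_URL,
    });
  }
}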

Writing Your Own Tool

You can register your own tools by implementing the IGraphTool interface and decorating the class with @graphTool(). Any such class is automatically registered with the /generate endpoint, and the LLM is able to use it as a tool.

import {tool} from '@langchain/core/tools';
import z from 'zod';
import {graphTool, IGraphTool} from 'lb4-llm-chat-component';

...
@graphTool()
export class AddTool implements IGraphTool {
  needsReview = false;

  build() {
    return tool(
      (ob: {a: number; b: number}) => {
        return ob.a + ob.b;
      },
      {
        name: 'add-tool',
        description: 'a tool to add two numbers',
        schema: z.object({
          a: z.number(),
          b: z.number(),
        }),
      },
    );
  }
}
