Implement multi-provider support for chat functionality and enhance configuration management #26
Changes from all commits: 589580e, 003af80, 7153f1a, c8db6af (base: main)
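In short, the change lets a caller pick a provider per chat. A minimal usage sketch of what the PR enables follows; the `RubyAI.configure` block and the environment variable are assumptions, since only `RubyAI.config` and the `*_api_key` readers appear in the diffs below.

```ruby
require 'rubyai'

# Hypothetical configuration; only RubyAI.config and the *_api_key readers
# are visible in this PR, so the configure block below is an assumption.
RubyAI.configure do |config|
  config.openai_api_key = ENV['OPENAI_API_KEY']
end

# Pick a provider per chat; model and temperature are optional keywords.
chat = RubyAI::Chat.new('openai', model: 'gpt-4o-mini', temperature: 0.5)
response = chat.call("Summarize the Faraday gem in one sentence.")

# Chat#call returns the parsed JSON body as a Hash (OpenAI chat-completions shape).
puts response.dig("choices", 0, "message", "content")
```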
New file (`@@ -0,0 +1,41 @@`) introducing the `RubyAI::Chat` class:

```ruby
module RubyAI
  class Chat
    attr_accessor :provider, :model, :temperature

    def initialize(provider, model: nil, temperature: 0.7)
      @provider = provider
      @model = model || RubyAI::Configuration::DEFAULT_MODEL
      @temperature = temperature
    end

    def call(messages)
      raise ArgumentError, "Messages cannot be empty" if messages.nil? || messages.empty?

      body = HTTP.build_body(messages, @provider, @model, @temperature)
      headers = HTTP.build_headers(provider, RubyAI.config)

      response = connection.post do |req|
        req.url Configuration::PROVIDERS[@provider] || Configuration::BASE_URL
        req.headers.merge!(headers)
        req.body = body.to_json
      end

      JSON.parse(response.body)
    end

    private

    def connection
      @connection ||= Faraday.new do |faraday|
        faraday.adapter Faraday.default_adapter
        faraday.headers['Content-Type'] = 'application/json'
      end
    rescue Faraday::Error => e
      raise "Connection error: #{e.message}"
    rescue JSON::ParserError => e
      raise "Response parsing error: #{e.message}"
    rescue StandardError => e
      raise "An unexpected error occurred: #{e.message}"
    end
  end
end
```

**Reviewer** (on `faraday.adapter Faraday.default_adapter`): The adapter should come last.

**Author:** It's just a copy-paste from the client, so I didn't know it should come last.
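On the adapter-ordering comment: Faraday's convention is to register settings and middleware first and declare the adapter last, since the adapter is the innermost layer that actually performs the HTTP call. A minimal sketch of the `connection` builder block with the order swapped, as an illustration of the reviewer's suggestion rather than the PR's code:

```ruby
require 'faraday'

# Sketch only: connection settings first, adapter as the final entry.
connection = Faraday.new do |faraday|
  faraday.headers['Content-Type'] = 'application/json'
  faraday.adapter Faraday.default_adapter # adapter goes last in the builder block
end
```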
Changes to `RubyAI::HTTP` (`@@ -2,19 +2,49 @@`): `build_body` (previously `build_body(messages, model, temperature)`) and `build_headers` (previously `build_headers(api_key)`) now take the provider and branch per provider. The resulting module:

```ruby
module RubyAI
  module HTTP
    extend self

    def build_body(messages, provider, model, temperature)
      case provider
      when 'openai'
        {
          'model': Configuration::MODELS[provider][model],
          'messages': [{ "role": "user", "content": messages }],
          'temperature': temperature
        }
      when 'anthropic'
        {
          'model' => Configuration::MODELS[provider][model],
          'max_tokens' => 1024, # Required parameter for the Anthropic API
          'messages' => format_messages_for_antropic(messages),
          'temperature' => temperature
        }
      end
    end

    def build_headers(provider, config)
      case provider
      when 'openai'
        {
          'Content-Type': 'application/json',
          'Authorization': "Bearer #{config.openai_api_key}"
        }
      when 'anthropic'
        {
          'x-api-key' => config.anthropic_api_key,
          'anthropic-version' => '2023-06-01'
        }
      end
    end

    private

    def format_messages_for_antropic(messages)
      # Messages should be an array of message objects.
      # Each message needs 'role' (either 'user' or 'assistant') and 'content'.
      if messages.is_a?(String)
        [{ 'role' => 'user', 'content' => messages }]
      else
        messages
      end
    end
  end
end
```

**Reviewer** (on `'max_tokens' => 1024`): Why stay with 1024 tokens? Can't we increase this?

**Author:** My bad, this will be fixed in future commits; in one of the next PRs I've changed the provider configuration and simply forgot about this.

**Reviewer** (on `'Content-Type': 'application/json'`): Faraday will add this header automatically.

**Author:** This line will be removed in future commits.
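On the max-tokens question: one way a follow-up could avoid the hard-coded 1024 is to accept it as a keyword argument with a default. A self-contained sketch; the `AnthropicBody` module name and the keyword argument are hypothetical, not part of this PR:

```ruby
# Hypothetical follow-up sketch: accept max_tokens from the caller instead of
# hard-coding 1024. Names mirror this PR but are illustrative only.
module AnthropicBody
  module_function

  def build(messages, model, temperature, max_tokens: 1024)
    {
      'model' => model,
      'max_tokens' => max_tokens, # caller-controlled cap on the completion length
      'messages' => [{ 'role' => 'user', 'content' => messages }],
      'temperature' => temperature
    }
  end
end

AnthropicBody.build("Hello", "claude-2", 0.7, max_tokens: 4096)
```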
New file (`@@ -0,0 +1,15 @@`) adding a provider registry:

```ruby
module RubyAI
  module Provider
    PROVIDERS = {
      'openai' => RubyAI::Providers::OpenAI,
      # Not tested yet because I don't have an Anthropic API key
      'anthropic' => RubyAI::Providers::Anthropic
    }

    module_function

    def [](provider)
      PROVIDERS.fetch(provider)
    end
  end
end
```
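For reference, a usage sketch of the registry, assuming the provider classes from this PR are loaded; `Hash#fetch` raises `KeyError` for unregistered providers:

```ruby
openai = RubyAI::Provider['openai']   # => RubyAI::Providers::OpenAI
openai.models.key?('gpt-4o-mini')     # => true

RubyAI::Provider['gemini']            # raises KeyError (not registered)
```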
New file (`@@ -0,0 +1,17 @@`) with the Anthropic provider stub:

```ruby
module RubyAI
  module Providers
    # Not tested yet because I don't have an Anthropic API key
    class Anthropic
      def self.models = {
        "claude-2" => "claude-2",
        "claude-instant-100k" => "claude-instant-100k",
        "claude-1" => "claude-1",
        "claude-1.3" => "claude-1.3",
        "claude-1.3-sonnet" => "claude-1.3-sonnet",
        "claude-1.3-sonnet-100k" => "claude-1.3-sonnet-100k"
      }.freeze
    end

    # TODO: configuration of separate models
  end
end
```
New file (`@@ -0,0 +1,20 @@`) with the OpenAI provider:

```ruby
module RubyAI
  module Providers
    class OpenAI
      DEFAULT_MODEL = "gpt-3.5-turbo".freeze

      def self.models
        {
          "gpt-3.5-turbo" => "gpt-3.5-turbo",
          "gpt-4" => "gpt-4",
          "gpt-4-32k" => "gpt-4-32k",
          "gpt-4-turbo" => "gpt-4-turbo",
          "gpt-4o-mini" => "gpt-4o-mini",
          "o1-mini" => "o1-mini",
          "o1-preview" => "o1-preview",
          "text-davinci-003" => "text-davinci-003"
        }
      end

      # TODO: configuration of separate models
    end
  end
end
```
This file was deleted.
New spec helper (`@@ -0,0 +1,2 @@`) requiring the chat class and WebMock:

```ruby
require_relative '../../lib/rubyai/chat'
require 'webmock/rspec'
```
**Reviewer:** Faraday can manage JSON automatically.

**Reply:** The same thing as in the previous message.
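For context on that comment: Faraday can encode request bodies and parse responses as JSON through its built-in middleware, which also sets the Content-Type header. A sketch of the reviewer's suggestion, not code from this PR; `request :json` and `response :json` ship with Faraday 2.x (and are available via faraday_middleware on 1.x):

```ruby
require 'faraday'

# Sketch: let Faraday handle JSON encoding/decoding and the Content-Type header.
connection = Faraday.new do |faraday|
  faraday.request :json    # serializes Hash bodies to JSON and sets Content-Type
  faraday.response :json   # parses JSON response bodies into Ruby Hashes
  faraday.adapter Faraday.default_adapter
end

response = connection.post("https://api.openai.com/v1/chat/completions") do |req|
  req.headers['Authorization'] = "Bearer #{ENV['OPENAI_API_KEY']}"
  req.body = { model: "gpt-4o-mini", messages: [{ role: "user", content: "Hi" }] }
end

response.body # already a parsed Hash thanks to the response middleware
```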