
Commit 3f3d45e

Rename Langchain.rb to LangChain.rb
In the LLM field, the name LangChain is widely recognized, and the project was likely originally intended to be called LangChain.rb as a product: ab9d988. It appears that the current use of the name `Langchain` is due to implementation constraints.

This PR updates the behavior so that Zeitwerk also recognizes the `LangChain` module name for the `langchain.rb` file. Both the existing `Langchain` and the new `LangChain` module names will work. However, the `Langchain` constant is now marked as deprecated, and a warning will be shown when it is used:

```console
$ bundle exec ruby -Ilib -rrails -rlangchainrb -e 'Langchain::Errors'
-e:1: warning: `Langchain` is deprecated. Use `LangChain` instead.
```

This change makes it possible to use the name `LangChain`, which is the standard name in the AI ecosystem. Because `LangChain` is a proper noun, renaming to this widely recognized and commonly used name follows the principle of least surprise for users.
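For context, here is a minimal sketch of how such a dual-constant setup could be wired in `lib/langchain.rb`, assuming Zeitwerk's inflector override API and Ruby's built-in `Module#deprecate_constant`; the actual mechanism in this commit, and the exact wording of its warning, may differ:

```ruby
# lib/langchain.rb -- illustrative sketch only, not necessarily what this commit does.
require "zeitwerk"

# Define the canonical, properly camel-cased namespace.
module LangChain
end

loader = Zeitwerk::Loader.for_gem
# Tell Zeitwerk that the "langchain" file/directory segment maps to the
# constant `LangChain` rather than the default inflection `Langchain`.
loader.inflector.inflect("langchain" => "LangChain")
loader.setup

# Backwards compatibility: keep the old top-level constant as an alias,
# and have Ruby warn when it is referenced.
Langchain = LangChain
Object.deprecate_constant(:Langchain)
```

Note that Ruby's built-in `deprecate_constant` prints its own, differently worded message and only when deprecation warnings are enabled, so producing a custom warning like the one shown in the commit message would need an extra hook (for example via `Warning.warn` or `const_missing`).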
1 parent fca4056 · commit 3f3d45e

File tree

224 files changed (+912 lines, -894 lines)


.github/ISSUE_TEMPLATE/bug_report.md

Lines changed: 2 additions & 2 deletions
@@ -25,7 +25,7 @@ If applicable, add screenshots to help explain your problem.
 
 - OS: [e.g. OS X, Linux, Ubuntu, Windows]
 - Ruby version [e.g. 3.1, 3.2, 3.3]
-- Langchain.rb version [e.g. 0.13.0]
+- LangChain.rb version [e.g. 0.13.0]
 
 **Additional context**
-Add any other context about the problem here.
+Add any other context about the problem here.

.standard.yml

Lines changed: 1 addition & 1 deletion
@@ -2,5 +2,5 @@ ignore:
 - "**/*":
   - Style/ArgumentsForwarding
 
-# Specify the minimum supported Ruby version supported by Langchain.rb.
+# Specify the minimum supported Ruby version supported by LangChain.rb.
 ruby_version: 3.1

CHANGELOG.md

Lines changed: 1 addition & 0 deletions
@@ -10,6 +10,7 @@
 - [SECURITY]: A change which fixes a security vulnerability.
 
 ## [Unreleased]
+- [COMPAT] [https://github.com/patterns-ai-core/langchainrb/pull/999] Rename `Langchain` to `LangChain`.
 - [COMPAT] [https://github.com/patterns-ai-core/langchainrb/pull/980] Suppress a Ruby 3.4 warning for URI parser.
 - [BREAKING] [https://github.com/patterns-ai-core/langchainrb/pull/997] Remove `Langchain::Vectorsearch::Epsilla` class
 - [BREAKING] [https://github.com/patterns-ai-core/langchainrb/pull/1003] Response classes are now namespaced under `Langchain::LLM::Response`, converted to Rails engine

README.md

Lines changed: 61 additions & 61 deletions
Large diffs are not rendered by default.

config/routes.rb

Lines changed: 1 addition & 1 deletion
@@ -1,2 +1,2 @@
-Langchain::Engine.routes.draw do
+LangChain::Engine.routes.draw do
 end

examples/assistant_chat.rb

Lines changed: 3 additions & 3 deletions
@@ -6,12 +6,12 @@
 # gem install reline
 # or add `gem "reline"` to your Gemfile
 
-openai = Langchain::LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])
-assistant = Langchain::Assistant.new(
+openai = LangChain::LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])
+assistant = LangChain::Assistant.new(
   llm: openai,
   instructions: "You are a Meteorologist Assistant that is able to pull the weather for any location",
   tools: [
-    Langchain::Tool::Weather.new(api_key: ENV["OPEN_WEATHER_API_KEY"])
+    LangChain::Tool::Weather.new(api_key: ENV["OPEN_WEATHER_API_KEY"])
   ]
 )

examples/create_and_manage_few_shot_prompt_templates.rb

Lines changed: 3 additions & 3 deletions
@@ -1,10 +1,10 @@
 require "langchain"
 
 # Create a prompt with a few shot examples
-prompt = Langchain::Prompt::FewShotPromptTemplate.new(
+prompt = LangChain::Prompt::FewShotPromptTemplate.new(
   prefix: "Write antonyms for the following words.",
   suffix: "Input: {adjective}\nOutput:",
-  example_prompt: Langchain::Prompt::PromptTemplate.new(
+  example_prompt: LangChain::Prompt::PromptTemplate.new(
     input_variables: ["input", "output"],
     template: "Input: {input}\nOutput: {output}"
   ),
@@ -32,5 +32,5 @@
 prompt.save(file_path: "spec/fixtures/prompt/few_shot_prompt_template.json")
 
 # Loading a new prompt template using a JSON file
-prompt = Langchain::Prompt.load_from_path(file_path: "spec/fixtures/prompt/few_shot_prompt_template.json")
+prompt = LangChain::Prompt.load_from_path(file_path: "spec/fixtures/prompt/few_shot_prompt_template.json")
 prompt.prefix # "Write antonyms for the following words."
Lines changed: 5 additions & 5 deletions
@@ -1,25 +1,25 @@
 require "langchain"
 
 # Create a prompt with one input variable
-prompt = Langchain::Prompt::PromptTemplate.new(template: "Tell me a {adjective} joke.", input_variables: ["adjective"])
+prompt = LangChain::Prompt::PromptTemplate.new(template: "Tell me a {adjective} joke.", input_variables: ["adjective"])
 prompt.format(adjective: "funny") # "Tell me a funny joke."
 
 # Create a prompt with multiple input variables
-prompt = Langchain::Prompt::PromptTemplate.new(template: "Tell me a {adjective} joke about {content}.", input_variables: ["adjective", "content"])
+prompt = LangChain::Prompt::PromptTemplate.new(template: "Tell me a {adjective} joke about {content}.", input_variables: ["adjective", "content"])
 prompt.format(adjective: "funny", content: "chickens") # "Tell me a funny joke about chickens."
 
 # Creating a PromptTemplate using just a prompt and no input_variables
-prompt = Langchain::Prompt::PromptTemplate.from_template("Tell me a {adjective} joke about {content}.")
+prompt = LangChain::Prompt::PromptTemplate.from_template("Tell me a {adjective} joke about {content}.")
 prompt.input_variables # ["adjective", "content"]
 prompt.format(adjective: "funny", content: "chickens") # "Tell me a funny joke about chickens."
 
 # Save prompt template to JSON file
 prompt.save(file_path: "spec/fixtures/prompt/prompt_template.json")
 
 # Loading a new prompt template using a JSON file
-prompt = Langchain::Prompt.load_from_path(file_path: "spec/fixtures/prompt/prompt_template.json")
+prompt = LangChain::Prompt.load_from_path(file_path: "spec/fixtures/prompt/prompt_template.json")
 prompt.input_variables # ["adjective", "content"]
 
 # Loading a new prompt template using a YAML file
-prompt = Langchain::Prompt.load_from_path(file_path: "spec/fixtures/prompt/prompt_template.yaml")
+prompt = LangChain::Prompt.load_from_path(file_path: "spec/fixtures/prompt/prompt_template.yaml")
 prompt.input_variables # ["adjective", "content"]

examples/create_and_manage_prompt_templates_using_structured_output_parser.rb

Lines changed: 4 additions & 4 deletions
@@ -37,8 +37,8 @@
   required: ["name", "age", "interests"],
   additionalProperties: false
 }
-parser = Langchain::OutputParsers::StructuredOutputParser.from_json_schema(json_schema)
-prompt = Langchain::Prompt::PromptTemplate.new(template: "Generate details of a fictional character.\n{format_instructions}\nCharacter description: {description}", input_variables: ["description", "format_instructions"])
+parser = LangChain::OutputParsers::StructuredOutputParser.from_json_schema(json_schema)
+prompt = LangChain::Prompt::PromptTemplate.new(template: "Generate details of a fictional character.\n{format_instructions}\nCharacter description: {description}", input_variables: ["description", "format_instructions"])
 prompt.format(description: "Korean chemistry student", format_instructions: parser.get_format_instructions)
 # Generate details of a fictional character.
 # You must format your output as a JSON value that adheres to a given "JSON Schema" instance.
@@ -58,7 +58,7 @@
 
 # Character description: Korean chemistry student
 
-llm = Langchain::LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])
+llm = LangChain::LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])
 llm_response = llm.chat(
   messages: [{
     role: "user",
@@ -91,7 +91,7 @@
 # ```
 # RESPONSE
 
-fix_parser = Langchain::OutputParsers::OutputFixingParser.from_llm(
+fix_parser = LangChain::OutputParsers::OutputFixingParser.from_llm(
   llm: llm,
   parser: parser
 )

examples/ollama_inquire_about_image.rb

Lines changed: 2 additions & 2 deletions
@@ -1,9 +1,9 @@
 require_relative "../lib/langchain"
 require "faraday"
 
-llm = Langchain::LLM::Ollama.new(default_options: {chat_model: "llava"})
+llm = LangChain::LLM::Ollama.new(default_options: {chat_model: "llava"})
 
-assistant = Langchain::Assistant.new(llm: llm)
+assistant = LangChain::Assistant.new(llm: llm)
 
 response = assistant.add_message_and_run(
   image_url: "https://gist.githubusercontent.com/andreibondarev/b6f444194d0ee7ab7302a4d83184e53e/raw/099e10af2d84638211e25866f71afa7308226365/sf-cable-car.jpg",
