Commit 81af55a

koic and andreibondarev authored
[Doc] Update the doc for Langchain::LLM::Ollama (#988)
This PR updates the doc for `Langchain::LLM::Ollama`. In practice, `Langchain::LLM::Ollama` depends on the Faraday gem. If `gem "faraday"` is not listed as a dependency in the user's Gemfile, the following error occurs:

```console
$ bundle exec ruby example.rb
/Users/koic/.rbenv/versions/3.3.4/lib/ruby/gems/3.3.0/gems/langchainrb-0.19.5/lib/langchain/dependency_helper.rb:38:in `rescue in depends_on': Could not load faraday. Please ensure that the faraday gem is installed. (Langchain::DependencyHelper::LoadError)
	from /Users/koic/.rbenv/versions/3.3.4/lib/ruby/gems/3.3.0/gems/langchainrb-0.19.5/lib/langchain/dependency_helper.rb:17:in `depends_on'
	from /Users/koic/.rbenv/versions/3.3.4/lib/ruby/gems/3.3.0/gems/langchainrb-0.19.5/lib/langchain/llm/ollama.rb:41:in `initialize'
```

In accordance with langchain.gemspec, no version constraint is specified: https://github.com/patterns-ai-core/langchainrb/blob/0.19.5/langchain.gemspec#L54

Co-authored-by: Andrei Bondarev <[email protected]>
1 parent ea6c3d1 commit 81af55a
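
For context, a minimal Gemfile that avoids the error above might look like the sketch below. The gem names are real; leaving the versions unconstrained mirrors langchain.gemspec, but the file itself is illustrative and not part of this commit.

```ruby
# Gemfile (illustrative sketch, not part of this commit)
source "https://rubygems.org"

gem "langchainrb" # provides Langchain::LLM::Ollama
gem "faraday"     # loaded lazily by Langchain::LLM::Ollama; must be declared here
```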

File tree

1 file changed: +3 -0 lines changed

lib/langchain/llm/ollama.rb

Lines changed: 3 additions & 0 deletions
```diff
@@ -4,6 +4,9 @@ module Langchain::LLM
   # Interface to Ollama API.
   # Available models: https://ollama.ai/library
   #
+  # Gem requirements:
+  #     gem "faraday"
+  #
   # Usage:
   #     llm = Langchain::LLM::Ollama.new(url: ENV["OLLAMA_URL"], default_options: {})
   #
```
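
As a usage note for the documented interface, the snippet below shows how the constructor from the doc comment might be exercised once `faraday` is in the Gemfile. It is a sketch, not code from this commit: it assumes a running Ollama server reachable via `OLLAMA_URL` and uses the `complete(prompt:)` method and `completion` response reader that langchainrb's LLM classes expose; the prompt is made up.

```ruby
require "langchain"

# Assumes OLLAMA_URL points at a running Ollama server, e.g. "http://localhost:11434".
llm = Langchain::LLM::Ollama.new(url: ENV["OLLAMA_URL"], default_options: {})

# Generate a completion; the prompt here is just an example.
response = llm.complete(prompt: "Name three Ruby web frameworks.")
puts response.completion
```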
