This PR updates the doc for `Langchain::LLM::Ollama`.
In practice, `Langchain::LLM::Ollama` depends on the Faraday gem.
If `gem "faraday"` is not listed as a dependency in the user's Gemfile, the following error occurs:
```console
$ bundle exec ruby example.rb
/Users/koic/.rbenv/versions/3.3.4/lib/ruby/gems/3.3.0/gems/langchainrb-0.19.5/lib/langchain/dependency_helper.rb:38:in `rescue in depends_on': Could not load faraday. Please ensure that the faraday gem is installed. (Langchain::DependencyHelper::LoadError)
from /Users/koic/.rbenv/versions/3.3.4/lib/ruby/gems/3.3.0/gems/langchainrb-0.19.5/lib/langchain/dependency_helper.rb:17:in `depends_on'
from /Users/koic/.rbenv/versions/3.3.4/lib/ruby/gems/3.3.0/gems/langchainrb-0.19.5/lib/langchain/llm/ollama.rb:41:in `initialize'
```
Consistent with langchain.gemspec, which does not constrain the Faraday version, no version constraint is specified:
https://github.com/patterns-ai-core/langchainrb/blob/0.19.5/langchain.gemspec#L54
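Since langchainrb itself places no version constraint on Faraday, users can resolve the error above by listing the gem alongside langchainrb without one. A minimal Gemfile sketch (gem versions shown are illustrative, not part of this PR):

```ruby
# Gemfile
source "https://rubygems.org"

gem "langchainrb"
# Required at runtime by Langchain::LLM::Ollama; langchain.gemspec
# declares no version constraint, so none is needed here either.
gem "faraday"
```

After `bundle install`, `Langchain::LLM::Ollama` can load Faraday and the `Langchain::DependencyHelper::LoadError` no longer occurs.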
Co-authored-by: Andrei Bondarev <[email protected]>