ChatOllama Not Respecting baseUrl Configuration: Converts localhost to 127.0.0.1
#5878
Replies: 3 comments
-
Hey @bajrangCoder! I'm here to help you with any bugs, questions, or contributions you have for the repository. Let's tackle this issue together!

To prevent LangChain from converting `localhost` to `127.0.0.1`, here is an example of how you can configure the `ChatOllama` instance with an explicit hostname:

```typescript
import { ChatOllama } from "@langchain/community/chat_models/ollama";

const modelInstance = new ChatOllama({
  baseUrl: "http://your-preferred-hostname:11434",
  model: "phi3",
});
```

This will ensure that the base URL you specify is used for requests.
-
That's just useless; LangChain still converts `localhost` to `127.0.0.1`. But I don't want that conversion.
-
@bajrangCoder you can check the `createOllamaStream` function in node_modules; there's a line that automatically changes `localhost` to `127.0.0.1` due to a Node.js 18 problem. If you don't want this to happen, you can modify that line inside node_modules.
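The line the comment refers to can be sketched as follows. This is a hypothetical reproduction of the described behavior, not the actual source of `createOllamaStream`; it only illustrates why just the literal string `localhost` is affected:

```typescript
// Hypothetical sketch of the rewrite described above (the real line inside
// @langchain/community may differ): the literal host "localhost" is swapped
// for "127.0.0.1" before the request is made, as a workaround for a
// Node.js 18 fetch/resolution issue.
function rewriteBaseUrl(baseUrl: string): string {
  return baseUrl.replace("localhost", "127.0.0.1");
}

console.log(rewriteBaseUrl("http://localhost:11434"));    // http://127.0.0.1:11434
console.log(rewriteBaseUrl("http://ollama.local:11434")); // unchanged
```

Because the check matches only the literal hostname, any other hostname passes through untouched.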
-
Description
I am encountering an issue with LangChain when specifying the base URL as `localhost`. Despite setting the base URL to `http://localhost:11434`, LangChain internally uses `127.0.0.1`. My application is configured to disable requests to IP addresses, allowing only hostname-based requests, which causes the requests to fail.

Issue Details:
- Base URL specified: `http://localhost:11434`
- LangChain converts `localhost` to `127.0.0.1` internally.
- Requests are made to `127.0.0.1`.

LangChain should respect the base URL specified as `localhost` without converting it to `127.0.0.1`.

Is there a configuration or workaround to prevent LangChain from converting `localhost` to `127.0.0.1`, or to ensure it makes requests using the hostname? Any guidance or suggestions to resolve this issue would be greatly appreciated. Thank you!
System Info
@langchain/community: "^0.2.5",
@langchain/core: "^0.2.5",