Conversation

@lambochen
Contributor

Thank you for taking time to contribute this pull request!
You might have already read the [contributor guide][1], but as a reminder, please make sure to:

  • Sign the contributor license agreement
  • Rebase your changes on the latest main branch and squash your commits
  • Add/Update unit tests as needed
  • Run a build and make sure all tests pass prior to submission

Hi, this PR is a feature enhancement that adds support for custom Ollama API paths.

Background: When integrating Spring AI into my project, I made some adjustments to the Ollama API, and the API exposed externally uses custom paths. I would therefore like the Spring AI Ollama model to support custom paths.

Current behavior: the Ollama API paths are hard-coded, for example:

public ListModelResponse listModels() {
	return this.restClient.get()
			.uri("/api/tags")
			.retrieve()
			.body(ListModelResponse.class);
}
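
For illustration only, here is a rough sketch of what a configurable path could look like. The class, constructor, and field names are hypothetical and this is not the actual change proposed in this PR; the real OllamaApi maps the response to ListModelResponse, while this sketch returns the raw body to stay self-contained.

import org.springframework.web.client.RestClient;

// Hypothetical sketch: the list-models path is supplied at construction time
// instead of being hard-coded to "/api/tags".
public class ConfigurablePathOllamaClient {

	private final RestClient restClient;

	private final String listModelsPath;

	public ConfigurablePathOllamaClient(String baseUrl, String listModelsPath) {
		this.restClient = RestClient.create(baseUrl);
		// e.g. "/api/tags" by default, or a custom value chosen by the platform
		this.listModelsPath = listModelsPath;
	}

	public String listModels() {
		// Returns the raw JSON body; the real API would map it to ListModelResponse.
		return this.restClient.get()
				.uri(this.listModelsPath)
				.retrieve()
				.body(String.class);
	}
}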

@lambochen
Contributor Author

ref: #3471

@lambochen
Contributor Author

@ThomasVitale @dev-jonghoonpark
Hi, please review the PR, thank you.

@lambochen lambochen requested a review from yuluo-yx June 8, 2025 16:18
@dev-jonghoonpark
Contributor

dev-jonghoonpark commented Jun 9, 2025

Just out of personal curiosity, can you be more specific with your example?
I've seen OpenAI-compatible APIs many times, but this is the first time I've seen an Ollama-compatible API format.

@lambochen
Contributor Author

lambochen commented Jun 9, 2025

but this is the first time I've seen an Ollama-compatible API format.

Thank you for your interest in this capability.
I am working on an experimental service where Ollama is deployed remotely (in a non-local environment) to provide platform services to developers. Because the platform exposes a large number of APIs, we have run into path conflicts, so we need to be able to customize the Ollama API path.

I've seen OpenAI-compatible APIs many times

Yes, for similar reasons we also have custom-path requirements for OpenAI and other models (such as DeepSeek).

@ilayaperumalg ilayaperumalg added this to the 1.1.x milestone Jun 9, 2025
@ThomasVitale
Contributor

@lambochen thanks for raising this issue. If I understood the situation correctly, this would seem more a platform issue than a Spring AI issue. Since the Ollama service is provided on-premises by an internal platform, I wonder if the path conflict issue might be fixed more easily by exposing Ollama via a subpath on the platform? I assume the Ollama endpoints will not change (e.g. /api/tags). But the base URL can be customised based on how the platform serves Ollama.

For example, it could be served by the platform from https://my-company-platform.example.net/ollama (which would be configured in the application via spring.ai.ollama.base-url). The full endpoint would then be https://my-company-platform.example.net/ollama/api/tags in the case of tags, which would already work with the current version of Spring AI.

Could that work?
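
For reference, a minimal configuration sketch of that workaround, assuming the spring.ai.ollama.base-url property mentioned above and the example platform URL (both illustrative):

# application.properties: point the Ollama client at the platform subpath
spring.ai.ollama.base-url=https://my-company-platform.example.net/ollama

With that in place, the client would call https://my-company-platform.example.net/ollama/api/tags for the tags endpoint, without any change to Spring AI.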

@lambochen
Contributor Author

@ThomasVitale Yes, your understanding is correct: it is indeed a platform issue (or rather, the platform's configuration does not align well with Ollama's existing paths).
The unified prefix you suggested on the platform side is also feasible and would work.

That said, if Spring AI supported custom paths, it would, from a user's perspective, integrate better with existing platforms in cases where the platform is not compatible with Ollama's default paths.

@markpollack markpollack removed this from the 1.1.0.M1 milestone Sep 25, 2025