Conversation

@davidkyle (Member) commented:

This change allows different services in the Inference API to be licensed at different levels by making the licence check per service. EIS is licensed at the basic level; everything else remains enterprise.

# Conflicts:
#	x-pack/plugin/inference/src/main/java/org/elasticsearch/xpack/inference/action/BaseTransportInferenceAction.java
#	x-pack/plugin/inference/src/main/java/org/elasticsearch/xpack/inference/action/filter/ShardBulkInferenceActionFilter.java
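The per-service pivot described above can be sketched as follows. This is a hypothetical illustration, not the actual Elasticsearch implementation: the class, enum, and method names mirror the PR's `InferenceLicenceCheck.isServiceLicenced` but are simplified assumptions.

```java
// Hypothetical sketch of a per-service licence check. EIS needs only a
// basic licence; every other inference service still requires enterprise.
public class InferenceLicenceCheckSketch {
    public enum License { BASIC, ENTERPRISE }

    // Illustrative mapping from service name to required licence level.
    static License requiredLevel(String serviceName) {
        return "Elastic Inference Service".equals(serviceName) ? License.BASIC : License.ENTERPRISE;
    }

    // A service is licenced if the current level meets or exceeds the requirement.
    public static boolean isServiceLicenced(String serviceName, License current) {
        return current.ordinal() >= requiredLevel(serviceName).ordinal();
    }

    public static void main(String[] args) {
        System.out.println(isServiceLicenced("Elastic Inference Service", License.BASIC)); // true
        System.out.println(isServiceLicenced("openai", License.BASIC));                    // false
    }
}
```

The key design point is that the licence requirement is looked up per service name rather than applied uniformly to the whole Inference API.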
@elasticsearchmachine added the Team:ML label on Oct 31, 2025
@elasticsearchmachine (Collaborator) commented:

Pinging @elastic/ml-core (Team:ML)

@elasticsearchmachine (Collaborator) commented:

Hi @davidkyle, I've created a changelog YAML for you.

@benwtrent (Member) left a comment:

Easy peasy. It makes sense to handle all the license checks as normal, but pivot on type.

Comment on lines 385 to 395
if (InferenceLicenceCheck.isServiceLicenced(inferenceProvider.service.name(), licenseState) == false) {
try (onFinish) {
for (FieldInferenceRequest request : requests) {
addInferenceResponseFailure(
request.bulkItemIndex,
InferenceLicenceCheck.complianceException(inferenceProvider.service.name())
);
}
return;
}
}
Member:

@carlosdelest what do you think of this? It seems OK to me. Both the old and new license check failures are per bulk item request. Now we just delay it until we have the inference provider loaded.

Member:

That makes sense: we're checking the inference provider service when we have it. As requests are grouped by inference provider, we can fail all of them at once if it's not compliant. 👍
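The fail-all-at-once pattern discussed here can be sketched as below. This is an illustrative reduction of the quoted snippet, not the Elasticsearch API: `Item`, `failUnlicenced`, and the predicate parameter are hypothetical names standing in for the real bulk-item and licence-state types.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Predicate;

// Hypothetical sketch: requests are grouped by inference service, so when a
// service fails the licence check, every bulk item for it gets the same
// per-item failure, mirroring the old per-item licence-failure behaviour.
public class BulkLicenceFailureSketch {
    public record Item(int bulkItemIndex, String serviceName) {}

    public static Map<Integer, String> failUnlicenced(List<Item> requests, Predicate<String> isLicenced) {
        Map<Integer, String> failures = new HashMap<>();
        for (Item item : requests) {
            if (!isLicenced.test(item.serviceName())) {
                // One failure per bulk item, keyed by its index in the bulk request.
                failures.put(item.bulkItemIndex(), "current license is non-compliant for [" + item.serviceName() + "]");
            }
        }
        return failures;
    }

    public static void main(String[] args) {
        List<Item> requests = List.of(new Item(0, "openai"), new Item(1, "openai"));
        System.out.println(failUnlicenced(requests, s -> false).size()); // 2: all items fail together
    }
}
```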


public static final LicensedFeature.Momentary EIS_INFERENCE_FEATURE = LicensedFeature.momentary(
"inference",
"eis",
Member:

It's a bit confusing to me: the ElasticInferenceService.NAME is elastic, but our license service name is eis. So the user configures elastic to get the license with name eis.

Could we name this elastic to match the name of the ElasticInferenceService.NAME?

@davidkyle (Member, Author) replied:

I've changed this to the proper name "Elastic Inference Service"

The error the user sees if the licence is not compatible is either:

"current license is non-compliant for [inference]"

Or

"current license is non-compliant for [Elastic Inference Service]"
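The two messages above share one template, differing only in the feature name. A minimal sketch of that formatting, assuming a hypothetical helper (the real message is produced by the x-pack licensing code, not by this method):

```java
// Illustrative only: shows how the non-compliance message quoted above is
// parameterised by the licensed feature's name.
public class ComplianceMessageSketch {
    public static String complianceMessage(String featureName) {
        return "current license is non-compliant for [" + featureName + "]";
    }

    public static void main(String[] args) {
        System.out.println(complianceMessage("inference"));
        System.out.println(complianceMessage("Elastic Inference Service"));
    }
}
```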

@jonathan-buttner (Contributor) left a comment:

Looks good. I agree with Ben: could we have the license string be elastic instead of eis? Or maybe Elastic Inference Service?

return;
}

if (InferenceLicenceCheck.isServiceLicenced(serviceName, licenseState) == false) {
Contributor:

nit: I wonder if we should move the reserved ID check to below this check?

# Conflicts:
#	x-pack/plugin/inference/src/main/java/org/elasticsearch/xpack/inference/action/TransportPutInferenceModelAction.java
@davidkyle davidkyle enabled auto-merge (squash) November 4, 2025 10:59
@davidkyle davidkyle merged commit 54c35e0 into elastic:main Nov 4, 2025
34 checks passed

Labels

>enhancement · :ml (Machine learning) · Team:ML · v9.3.0

5 participants