[Epic] Enabling inference AI Connector as a default experience for all Kibana GenAI functionality #207140

@YulNaumenko

Description

Summary

The Inference AI Connector is going to be the single, default experience for Elastic and external LLM integration in Kibana. With that in mind, all existing Kibana GenAI connectors should be deprecated, and all Kibana usage of those connectors should be migrated to the Inference AI Connector.

Goals

  • Remove the feature flag and make the connector GA.
  • Add support for the unified completion spec approved in https://github.com/elastic/search-team/issues/8456 (an OpenAI-compatible completion schema) as the standard for all providers that Kibana GenAI functionality currently uses (openai, amazonbedrock, googlegemini, and others). For now, the Inference API supports this only for the openai provider, implemented by PR [Inference API] Add unified api for chat completions elasticsearch#117589.
  • Add support for a token usage dashboard once the Inference API includes the used token count in its responses.
  • Add UX and proper functionality for migrating from the existing provider-specific AI connectors to the Inference connector.
  • For the Inference AI Connector modal/flyout, preselect the provider and the unified completion task type, and collapse Additional options when the connector has the Kibana AI Assistant as its context (passed through the extended context implementation at the connector framework level).
  • Substitute the usage of the existing .openai, .bedrock, and .gemini connectors with the corresponding AI Connector providers in Kibana GenAI functionality.
  • Deprecate the existing .openai, .bedrock, and .gemini connectors.
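Because the unified completion spec is OpenAI-compatible, a Kibana feature could send one request shape regardless of the underlying provider. A minimal sketch, assuming the spec mirrors the OpenAI chat completions schema; the type names and exact fields below are illustrative assumptions, not the actual Kibana or Elasticsearch types:

```typescript
// Hypothetical shapes for an OpenAI-compatible unified chat completion.
// Field names follow the OpenAI chat completions API by assumption.

interface UnifiedChatMessage {
  role: 'system' | 'user' | 'assistant' | 'tool';
  content: string;
}

interface UnifiedChatCompletionRequest {
  // The inference endpoint may already pin a model, so this stays optional.
  model?: string;
  messages: UnifiedChatMessage[];
  temperature?: number;
  max_tokens?: number;
}

interface UnifiedChatCompletionResponse {
  id: string;
  choices: Array<{
    index: number;
    message: UnifiedChatMessage;
    finish_reason: 'stop' | 'length' | 'tool_calls';
  }>;
  // Token counts like these would feed the token usage dashboard goal above.
  usage?: {
    prompt_tokens: number;
    completion_tokens: number;
    total_tokens: number;
  };
}

// Example request a Kibana feature might send to any provider
// (openai, amazonbedrock, googlegemini, ...):
const request: UnifiedChatCompletionRequest = {
  messages: [
    { role: 'system', content: 'You are the Kibana AI Assistant.' },
    { role: 'user', content: 'Summarize the last 24h of error logs.' },
  ],
  temperature: 0,
};
```

The point of the sketch is that only the connector configuration (provider, endpoint, credentials) varies per provider; the request and response contract stays constant for callers.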

Tasks

All tasks are in sub-issues under this epic.
