[Epic] Enabling inference AI Connector as a default experience for all Kibana GenAI functionality #207140
2 of 7 issues completed
Description
Summary
The Inference AI Connector is going to be the single, default experience for Elastic and external LLM integration in Kibana. With that in mind, all existing Kibana GenAI connectors should be deprecated, and all Kibana usage of those connectors should be migrated to the Inference AI Connector.
Goals
- Remove feature flag and make connector GA
- Add support for the unified completion spec approved in https://github.com/elastic/search-team/issues/8456 (OpenAI-compatible completion schema) as a standard for all providers that Kibana GenAI functionality uses (`openai`, `amazonbedrock`, `googlegemini`, and others). For now the Inference API supports this for the `openai` provider only, implemented by PR [Inference API] Add unified api for chat completions elasticsearch#117589. A hedged request sketch follows this list.
- Add support for the tokens usage Dashboard once the Inference API includes the used token count in the response.
- Add UX and proper functionality for migrating from the existing provider-specific AI connectors to the Inference connector.
- For the Inference AI Connector modal/flyout, preselect the provider and the unified completion task type and collapse Additional options when the connector has the Kibana AI Assistant as its context (passed through the extended context implementation at the connector framework level).
- Substitute the usage of the existing `.openai`, `.bedrock`, and `.gemini` connectors with the corresponding AI Connector providers in Kibana GenAI functionality (see the mapping sketch after this list).
- Deprecate usage of the existing `.openai`, `.bedrock`, and `.gemini` connectors.
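
To make the unified completion goal concrete, here is a minimal TypeScript sketch of an OpenAI-compatible chat completion call against the Elasticsearch Inference API. The endpoint path, the streaming behavior, and the `unifiedChatCompletion` / `ChatCompletionUsage` names are assumptions for illustration, not the epic's actual implementation; the message and usage shapes mirror the public OpenAI completion schema.

```ts
// Sketch only: shapes follow the OpenAI-compatible completion schema the epic
// standardizes on; the endpoint path below is an assumption based on
// elasticsearch#117589, not a confirmed Kibana integration point.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

// Token counts as reported in the OpenAI schema; the "tokens usage Dashboard"
// goal depends on the Inference API returning counts like these.
interface ChatCompletionUsage {
  prompt_tokens: number;
  completion_tokens: number;
  total_tokens: number;
}

async function unifiedChatCompletion(
  esUrl: string,
  apiKey: string,
  inferenceId: string, // id of an inference endpoint backed by e.g. the `openai` provider
  messages: ChatMessage[]
): Promise<string> {
  const res = await fetch(`${esUrl}/_inference/chat_completion/${inferenceId}/_stream`, {
    method: 'POST',
    headers: {
      Authorization: `ApiKey ${apiKey}`,
      'Content-Type': 'application/json',
    },
    // OpenAI-compatible body: a `messages` array, regardless of which
    // provider (openai, amazonbedrock, googlegemini, ...) backs the endpoint.
    body: JSON.stringify({ messages }),
  });
  if (!res.ok) {
    throw new Error(`Inference request failed: ${res.status} ${await res.text()}`);
  }
  // The unified API streams server-sent events; for brevity this sketch
  // returns the raw stream text instead of parsing individual chunks.
  return await res.text();
}
```

For example, `unifiedChatCompletion('http://localhost:9200', key, 'my-openai-endpoint', [{ role: 'user', content: 'Summarize the open alerts.' }])` would exercise the same schema for any provider once the spec is adopted beyond `openai`.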
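
The substitution and deprecation goals imply a mapping from the legacy connector type ids to inference providers. The sketch below is purely illustrative, assuming the provider ids listed above; `DEPRECATED_CONNECTOR_TO_PROVIDER` and `providerForDeprecatedConnector` are hypothetical names, and the real migration mechanics live in the sub-issues.

```ts
// Hypothetical mapping from deprecated Kibana GenAI connector type ids to the
// Inference AI Connector providers named in this epic.
const DEPRECATED_CONNECTOR_TO_PROVIDER: Record<string, string> = {
  '.openai': 'openai',
  '.bedrock': 'amazonbedrock',
  '.gemini': 'googlegemini',
};

function providerForDeprecatedConnector(connectorTypeId: string): string {
  const provider = DEPRECATED_CONNECTOR_TO_PROVIDER[connectorTypeId];
  if (!provider) {
    throw new Error(`No inference provider mapped for connector type "${connectorTypeId}"`);
  }
  return provider;
}

// e.g. providerForDeprecatedConnector('.bedrock') === 'amazonbedrock'
```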
Tasks
All tasks are in sub-issues under this epic.