
Commit 3571920

[ML] AI Connector/Inference endpoints creation UI: Adds icon for AI21 labs and Llama Stack (#232098)
## Summary

Fixes #230463

The corresponding back-end [pull request](elastic/elasticsearch#133233) has not been merged yet to avoid showing AI21 Labs and Llama Stack providers without icons in the list of available connector services. This PR is intended to go first for that reason. Once we receive the updated list of services, the front end will be ready to display them correctly.

I've been provided with the expected response objects:

### AI21

```json
{
  "service": "ai21",
  "name": "AI21",
  "task_types": ["completion", "chat_completion"],
  "configurations": {
    "api_key": {
      "description": "API Key for the provider you're connecting to.",
      "label": "API Key",
      "required": true,
      "sensitive": true,
      "updatable": true,
      "type": "str",
      "supported_task_types": ["completion", "chat_completion"]
    },
    "rate_limit.requests_per_minute": {
      "description": "Minimize the number of rate limit errors.",
      "label": "Rate Limit",
      "required": false,
      "sensitive": false,
      "updatable": false,
      "type": "int",
      "supported_task_types": ["completion", "chat_completion"]
    },
    "model_id": {
      "description": "Refer to the AI21 models documentation for the list of available inference models.",
      "label": "Model",
      "required": true,
      "sensitive": false,
      "updatable": false,
      "type": "str",
      "supported_task_types": ["completion", "chat_completion"]
    }
  }
}
```

### Llama

```json
{
  "service": "llama",
  "name": "Llama",
  "task_types": ["text_embedding", "completion", "chat_completion"],
  "configurations": {
    "api_key": {
      "description": "API Key for the provider you're connecting to.",
      "label": "API Key",
      "required": true,
      "sensitive": true,
      "updatable": true,
      "type": "str",
      "supported_task_types": ["text_embedding", "completion", "chat_completion"]
    },
    "rate_limit.requests_per_minute": {
      "description": "Minimize the number of rate limit errors.",
      "label": "Rate Limit",
      "required": false,
      "sensitive": false,
      "updatable": false,
      "type": "int",
      "supported_task_types": ["text_embedding", "completion", "chat_completion"]
    },
    "model_id": {
      "description": "Refer to the Llama models documentation for the list of available models.",
      "label": "Model",
      "required": true,
      "sensitive": false,
      "updatable": false,
      "type": "str",
      "supported_task_types": ["text_embedding", "completion", "chat_completion"]
    },
    "url": {
      "description": "The URL endpoint to use for the requests.",
      "label": "URL",
      "required": true,
      "sensitive": false,
      "updatable": false,
      "type": "str",
      "supported_task_types": ["text_embedding", "completion", "chat_completion"]
    }
  }
}
```
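For reference, here is a minimal TypeScript sketch of the shape these responses take as the connector UI consumes them. The field names are taken from the objects above; the interface names themselves are illustrative assumptions, not the actual types exported by `kbn-inference-endpoint-ui-common`.

```ts
// Illustrative sketch only: field names mirror the response objects above;
// the interface names are assumptions, not the actual Kibana types.
interface InferenceServiceConfigField {
  description: string;
  label: string;
  required: boolean;
  sensitive: boolean;
  updatable: boolean;
  type: 'str' | 'int';
  supported_task_types: string[];
}

interface InferenceService {
  service: string; // e.g. 'ai21' or 'llama'
  name: string; // display name, e.g. 'AI21'
  task_types: string[]; // e.g. ['completion', 'chat_completion']
  configurations: Record<string, InferenceServiceConfigField>;
}
```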
### Checklist

Check the PR satisfies following conditions. Reviewers should verify this PR satisfies this list as well.

- [ ] Any text added follows [EUI's writing guidelines](https://elastic.github.io/eui/#/guidelines/writing), uses sentence case text and includes [i18n support](https://github.com/elastic/kibana/blob/main/src/platform/packages/shared/kbn-i18n/README.md)
- [ ] [Documentation](https://www.elastic.co/guide/en/kibana/master/development-documentation.html) was added for features that require explanation or tutorials
- [ ] [Unit or functional tests](https://www.elastic.co/guide/en/kibana/master/development-tests.html) were updated or added to match the most common scenarios
- [ ] If a plugin configuration key changed, check if it needs to be allowlisted in the cloud and added to the [docker list](https://github.com/elastic/kibana/blob/main/src/dev/build/tasks/os_packages/docker_generator/resources/base/bin/kibana-docker)
- [ ] This was checked for breaking HTTP API changes, and any breaking changes have been approved by the breaking-change committee. The `release_note:breaking` label should be applied in these situations.
- [ ] [Flaky Test Runner](https://ci-stats.kibana.dev/trigger_flaky_test_runner/1) was used on any tests changed
- [ ] The PR description includes the appropriate Release Notes section, and the correct `release_note:*` label is applied per the [guidelines](https://www.elastic.co/guide/en/kibana/master/contributing.html#kibana-release-notes-process)
- [ ] Review the [backport guidelines](https://docs.google.com/document/d/1VyN5k91e5OVumlc0Gb9RPa3h1ewuPE705nRtioPiTvY/edit?usp=sharing) and apply applicable `backport:*` labels.

### Identify risks

Does this PR introduce any risks? For example, consider risks like hard to test bugs, performance regression, potential of data loss. Describe the risk, its severity, and mitigation for each identified risk. Invite stakeholders and evaluate how to proceed before merging.

- [ ] [See some risk examples](https://github.com/elastic/kibana/blob/main/RISK_MATRIX.mdx)
- [ ] ...

---------

Co-authored-by: Kibana Machine <[email protected]>
1 parent e260f4a commit 3571920

File tree

4 files changed, +72 -0 lines changed
Lines changed: 32 additions & 0 deletions (file collapsed in the diff view; contents not shown)
Lines changed: 26 additions & 0 deletions (file collapsed in the diff view; contents not shown)

x-pack/platform/packages/shared/kbn-inference-endpoint-ui-common/src/components/providers/render_service_provider/service_provider.tsx

Lines changed: 12 additions & 0 deletions
```diff
@@ -31,6 +31,8 @@ import ibmWatsonxIcon from '../assets/images/ibm_watsonx.svg';
 import jinaAIIcon from '../assets/images/jinaai.svg';
 import voyageAIIcon from '../assets/images/voyageai.svg';
 import deepSeekIcon from '../assets/images/deepseek.svg';
+import ai21Icon from '../assets/images/ai21_labs_default.svg';
+import llamaIcon from '../assets/images/llama_stack_default.svg';
 
 interface ServiceProviderProps {
   providerKey: ServiceProviderKeys;
@@ -146,6 +148,16 @@ export const SERVICE_PROVIDERS: Record<ServiceProviderKeys, ServiceProviderRecor
     name: 'DeepSeek',
     solutions: ['Search'],
   },
+  [ServiceProviderKeys.ai21]: {
+    icon: ai21Icon,
+    name: 'AI21 labs',
+    solutions: ['Search'],
+  },
+  [ServiceProviderKeys.llama]: {
+    icon: llamaIcon,
+    name: 'Llama Stack',
+    solutions: ['Search'],
+  },
 };
 
 export const ServiceProviderIcon: React.FC<ServiceProviderProps> = ({ providerKey }) => {
```
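For context, a hedged sketch of how entries like the two added above are typically consumed: the UI looks up the record by its `ServiceProviderKeys` value and renders the bundled SVG. The `renderProviderIcon` helper and the `EuiIcon` usage below are assumptions based on common Kibana EUI patterns, not the actual `ServiceProviderIcon` implementation.

```tsx
import React from 'react';
import { EuiIcon } from '@elastic/eui';

// Assumed shape, mirroring the SERVICE_PROVIDERS entries in the diff above.
interface ServiceProviderRecord {
  icon: string; // SVG asset resolved by the bundler
  name: string;
  solutions: string[];
}

// Hypothetical helper illustrating how the new ai21/llama entries become
// selectable icons in the connector UI: unknown keys simply render nothing.
export const renderProviderIcon = (
  providers: Record<string, ServiceProviderRecord>,
  providerKey: string
): React.ReactNode => {
  const provider = providers[providerKey];
  return provider ? (
    <EuiIcon type={provider.icon} size="l" aria-label={provider.name} />
  ) : null;
};
```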

x-pack/platform/packages/shared/kbn-inference-endpoint-ui-common/src/constants.tsx

Lines changed: 2 additions & 0 deletions
```diff
@@ -30,6 +30,8 @@ export enum ServiceProviderKeys {
   openai = 'openai',
   voyageai = 'voyageai',
   watsonxai = 'watsonxai',
+  ai21 = 'ai21',
+  llama = 'llama',
 }
 
 export const GEMINI_REGION_DOC_LINK = (
```
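A small illustrative snippet (not from the Kibana source) of why the enum values added above must match the `service` field returned by the back end: the UI keys its provider records by that string, so `'ai21'` and `'llama'` have to agree on both sides.

```ts
// Illustrative only: the enum members added in this PR, shown standalone.
enum ServiceProviderKeys {
  ai21 = 'ai21',
  llama = 'llama',
}

// A response object like the ones in the PR description.
const serviceFromApi = { service: 'ai21', name: 'AI21' };

// The icon lookup only works because the enum value equals the API string.
const key = serviceFromApi.service as ServiceProviderKeys;
console.log(key === ServiceProviderKeys.ai21); // true
```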

Comments (0)