Description
As a user, I want to be able to specify whether a deployed TrustyAIService should configure a specific model server (e.g. ModelMesh or KServe) or not.
The operator should provide defaults if this configuration is not present.
As an example, if I do not want KServe inference services to be automatically configured, but ModelMesh ones to be, I would add the following to a TrustyAIService CR:
```yaml
payloadProcessor:
  modelmesh: yes
  kserve: no
```
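For context, here is a minimal sketch of how this could look in a complete TrustyAIService CR, assuming the proposed `payloadProcessor` block sits directly under `spec` alongside the existing `storage`/`data`/`metrics` fields; the placement and the surrounding values are illustrative assumptions, not something this proposal fixes:

```yaml
apiVersion: trustyai.opendatahub.io/v1alpha1
kind: TrustyAIService
metadata:
  name: trustyai-service-example
spec:
  storage:
    format: "PVC"
    folder: "/inputs"
    size: "1Gi"
  data:
    filename: "data.csv"
    format: "CSV"
  metrics:
    schedule: "5s"
  # Proposed block: configure ModelMesh automatically, leave KServe alone.
  # If the block is omitted, the operator falls back to its defaults.
  payloadProcessor:
    modelmesh: yes
    kserve: no
```

Note that many YAML parsers treat bare `yes`/`no` as booleans, so in practice these fields would behave as a boolean toggle per serving platform.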
Validation
When deploying a TrustyAIService with
```yaml
payloadProcessor:
  modelmesh: yes
  kserve: no
```
and a ModelMesh inference service: ModelMesh should be configured.
When deploying a TrustyAIService with
```yaml
payloadProcessor:
  modelmesh: no
  kserve: yes
```
and a ModelMesh inference service: ModelMesh should not be configured. With a KServe inference service: it should be configured.
When deploying a TrustyAIService with
```yaml
payloadProcessor:
  modelmesh: yes
  kserve: yes
```
any inference service (ModelMesh or KServe) should be configured.
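To make the scenarios above more concrete, one possible reading of what "configured" means on the KServe side is that the operator injects KServe's inference logger into matching InferenceServices, pointed at the TrustyAIService endpoint. The `logger` stanza below is part of the stock KServe InferenceService API, but the URL, the model details, and the idea that the operator would inject it are assumptions for illustration only:

```yaml
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: example-model
spec:
  predictor:
    # Hypothetical result of "kserve: yes": a logger that forwards
    # request/response payloads to the TrustyAIService.
    logger:
      mode: all
      url: http://trustyai-service.example-namespace.svc.cluster.local
    model:
      modelFormat:
        name: sklearn
      storageUri: gs://example-bucket/sklearn/model
```

With `kserve: no`, the operator would presumably leave such InferenceServices untouched, matching the first scenario above.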