@@ -0,0 +1,42 @@
# --------------------------------------------------------------------
# Copyright (c) 2025, WSO2 LLC. (https://www.wso2.com).
#
# WSO2 LLC. licenses this file to you under the Apache License,
# Version 2.0 (the "License"); you may not use this file except
# in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
# --------------------------------------------------------------------

apiVersion: gateway.api-platform.wso2.com/v1alpha1
kind: LlmProviderTemplate
metadata:
  name: mistralai
spec:
  displayName: MistralAI
  promptTokens:
    location: payload
    identifier: $.usage.prompt_tokens
  completionTokens:
    location: payload
    identifier: $.usage.completion_tokens
  totalTokens:
    location: payload
    identifier: $.usage.total_tokens
  remainingTokens:
    location: header
    identifier: x-ratelimit-remaining-tokens
  requestModel:
    location: payload
    identifier: $.model
  responseModel:
    location: payload
    identifier: $.model
159 changes: 159 additions & 0 deletions gateway/it/features/llm-provider-templates.feature
@@ -0,0 +1,159 @@
# --------------------------------------------------------------------
# Copyright (c) 2025, WSO2 LLC. (https://www.wso2.com).
#
# WSO2 LLC. licenses this file to you under the Apache License,
# Version 2.0 (the "License"); you may not use this file except
# in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
# --------------------------------------------------------------------

Feature: LLM Provider Template Management
  As an API administrator
  I want to manage LLM provider templates in the gateway
  So that I can configure token tracking and model extraction metadata for different LLM providers

  Background:
    Given the gateway services are running

  # ========================================
  # Scenario Group 1: Template Lifecycle (Happy Path)
  # ========================================

  Scenario: Complete template lifecycle - create, retrieve, update, and delete
    Given I authenticate using basic auth as "admin"
    When I create this LLM provider template:
      """
      apiVersion: gateway.api-platform.wso2.com/v1alpha1
      kind: LlmProviderTemplate
      metadata:
        name: openai-test
      spec:
        displayName: OpenAI
        promptTokens:
          location: payload
          identifier: $.usage.prompt_tokens
        completionTokens:
          location: payload
          identifier: $.usage.completion_tokens
        totalTokens:
          location: payload
          identifier: $.usage.total_tokens
        remainingTokens:
          location: header
          identifier: x-ratelimit-remaining-tokens
        requestModel:
          location: payload
          identifier: $.model
        responseModel:
          location: payload
          identifier: $.model
      """
    Then the response status code should be 201
    And the response should be valid JSON
    And the JSON response field "status" should be "success"
    And the JSON response field "id" should be "openai-test"
    And the JSON response field "message" should be "LLM provider template created successfully"

    Given I authenticate using basic auth as "admin"
    When I retrieve the LLM provider template "openai-test"
    Then the response status code should be 200
    And the response should be valid JSON
    And the JSON response field "status" should be "success"
    And the JSON response field "template.id" should be "openai-test"
    And the JSON response field "template.configuration.spec.displayName" should be "OpenAI"
    And the JSON response field "template.configuration.spec.promptTokens.location" should be "payload"
    And the JSON response field "template.configuration.spec.promptTokens.identifier" should be "$.usage.prompt_tokens"

    Given I authenticate using basic auth as "admin"
    When I update the LLM provider template "openai-test" with:
      """
      apiVersion: gateway.api-platform.wso2.com/v1alpha1
      kind: LlmProviderTemplate
      metadata:
        name: openai-test
      spec:
        displayName: OpenAI Updated
        promptTokens:
          location: payload
          identifier: $.usage.promptTokens
        completionTokens:
          location: payload
          identifier: $.usage.completion_tokens
        totalTokens:
          location: payload
          identifier: $.usage.total_tokens
        remainingTokens:
          location: header
          identifier: x-ratelimit-remaining-tokens
        requestModel:
          location: payload
          identifier: $.model
        responseModel:
          location: payload
          identifier: $.model
      """
    Then the response status code should be 200
    And the response should be valid JSON
    And the JSON response field "status" should be "success"
    And the JSON response field "id" should be "openai-test"
    And the JSON response field "message" should be "LLM provider template updated successfully"

    Given I authenticate using basic auth as "admin"
    When I retrieve the LLM provider template "openai-test"
    Then the response status code should be 200
    And the JSON response field "template.configuration.spec.displayName" should be "OpenAI Updated"
    And the JSON response field "template.configuration.spec.promptTokens.location" should be "payload"
    And the JSON response field "template.configuration.spec.promptTokens.identifier" should be "$.usage.promptTokens"

    Given I authenticate using basic auth as "admin"
    When I delete the LLM provider template "openai-test"
    Then the response status code should be 200
    And the JSON response field "status" should be "success"
    And the JSON response field "message" should be "LLM provider template deleted successfully"

    Given I authenticate using basic auth as "admin"
    When I retrieve the LLM provider template "openai-test"
    Then the response status code should be 404
    And the response should be valid JSON
    And the JSON response field "status" should be "error"

  Scenario: Create template with minimal required fields
    Given I authenticate using basic auth as "admin"
    When I create this LLM provider template:
      """
      apiVersion: gateway.api-platform.wso2.com/v1alpha1
      kind: LlmProviderTemplate
      metadata:
        name: minimal-template
      spec:
        displayName: Minimal Template
      """
    Then the response status code should be 201
    And the response should be valid JSON
    And the JSON response field "id" should be "minimal-template"

    Given I authenticate using basic auth as "admin"
    When I retrieve the LLM provider template "minimal-template"
    Then the response status code should be 200
    And the JSON response field "template.configuration.spec.displayName" should be "Minimal Template"

    Given I authenticate using basic auth as "admin"
    When I delete the LLM provider template "minimal-template"
    Then the response status code should be 200

  Scenario: List LLM provider templates returns valid JSON with OOB Templates
    Given I authenticate using basic auth as "admin"
    When I list all LLM provider templates
    Then the response status code should be 200
    And the response should be valid JSON
    And the JSON response field "status" should be "success"
    And the response should contain oob-templates
116 changes: 116 additions & 0 deletions gateway/it/steps_llm.go
@@ -0,0 +1,116 @@
/*
* Copyright (c) 2025, WSO2 LLC. (https://www.wso2.com).
*
* WSO2 LLC. licenses this file to you under the Apache License,
* Version 2.0 (the "License"); you may not use this file except
* in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/

package it

import (
	"encoding/json"
	"fmt"

	"github.com/cucumber/godog"
	"github.com/wso2/api-platform/gateway/it/steps"
)

// RegisterLLMSteps registers all LLM provider template step definitions
func RegisterLLMSteps(ctx *godog.ScenarioContext, state *TestState, httpSteps *steps.HTTPSteps) {
	ctx.Step(`^I create this LLM provider template:$`, func(body *godog.DocString) error {
		httpSteps.SetHeader("Content-Type", "application/yaml")
		return httpSteps.SendPOSTToService("gateway-controller", "/llm-provider-templates", body)
	})

	ctx.Step(`^I retrieve the LLM provider template "([^"]*)"$`, func(templateID string) error {
		return httpSteps.SendGETToService("gateway-controller", "/llm-provider-templates/"+templateID)
	})

	ctx.Step(`^I update the LLM provider template "([^"]*)" with:$`, func(templateID string, body *godog.DocString) error {
		httpSteps.SetHeader("Content-Type", "application/yaml")
		return httpSteps.SendPUTToService("gateway-controller", "/llm-provider-templates/"+templateID, body)
	})

	ctx.Step(`^I delete the LLM provider template "([^"]*)"$`, func(templateID string) error {
		return httpSteps.SendDELETEToService("gateway-controller", "/llm-provider-templates/"+templateID)
	})

	ctx.Step(`^I list all LLM provider templates$`, func() error {
		return httpSteps.SendGETToService("gateway-controller", "/llm-provider-templates")
	})

	ctx.Step(`^I list LLM provider templates with filter "([^"]*)" as "([^"]*)"$`, func(filterKey, filterValue string) error {
		return httpSteps.SendGETToService("gateway-controller", "/llm-provider-templates?"+filterKey+"="+filterValue)
	})

	ctx.Step(`^the response should contain oob-templates$`, func() error {
		// Verifies that the out-of-box (OOB) templates are present in the list
		// response by decoding the body and checking the returned template IDs.
		body := httpSteps.LastBody()
		if len(body) == 0 {
			return fmt.Errorf("expected non-empty response body for oob-templates assertion")
		}

		var response struct {
			Count     int `json:"count"`
			Templates []struct {
				ID string `json:"id"`
			} `json:"templates"`
		}

		if err := json.Unmarshal([]byte(body), &response); err != nil {
			return fmt.Errorf("failed to parse response JSON: %w", err)
		}

		// Expected OOB template IDs
		expectedIDs := []string{
			"azureai-foundry",
			"anthropic",
			"openai",
			"gemini",
			"azure-openai",
			"mistralai",
			"awsbedrock",
		}

		// The reported count must cover at least the expected set
		expectedCount := len(expectedIDs)
		if response.Count < expectedCount {
			return fmt.Errorf(
				"expected template count to be >= %d, but got %d",
				expectedCount,
				response.Count,
			)
		}

		// Collect the actual template IDs for membership checks
		actualIDs := make(map[string]bool)
		for _, t := range response.Templates {
			actualIDs[t.ID] = true
		}

		// Validate that every expected ID is present
		for _, expectedID := range expectedIDs {
			if !actualIDs[expectedID] {
				return fmt.Errorf(
					"expected oob-template with id '%s' was not found in response",
					expectedID,
				)
			}
		}

		return nil
	})
}
2 changes: 2 additions & 0 deletions gateway/it/suite_test.go
@@ -80,6 +80,7 @@ func TestFeatures(t *testing.T) {
			"features/basic-ratelimit.feature",
			"features/jwt-auth.feature",
			"features/cors.feature",
			"features/llm-provider-templates.feature",
		},
		TestingT: t,
	},
@@ -236,6 +237,7 @@ func InitializeScenario(ctx *godog.ScenarioContext) {
	RegisterAPISteps(ctx, testState, httpSteps)
	RegisterMCPSteps(ctx, testState, httpSteps)
	RegisterJWTSteps(ctx, testState, httpSteps, jwtSteps)
	RegisterLLMSteps(ctx, testState, httpSteps)
}

// Register common HTTP and assertion steps