## Architecture

The framework follows a **separation of concerns** design:

- **Profiles**: Define deployment environments and configurations
- **Test Cases**: Reusable test logic that can be shared across profiles
- **Framework**: Core infrastructure for test execution and reporting

### Supported Profiles

- **ai-gateway**: Tests Semantic Router with Envoy AI Gateway integration
- **istio**: Tests Semantic Router with Istio Gateway (future)
### Directory Structure

```
e2e/
├── cmd/                            # Test runner entry point (cmd/e2e/main.go)
├── pkg/
│   ├── framework/                  # Core test framework
│   ├── cluster/                    # Kind cluster management
│   ├── docker/                     # Docker image operations
│   ├── helm/                       # Helm deployment utilities
│   ├── helpers/                    # Kubernetes helper functions
│   └── testcases/                  # Test case registry
├── testcases/                      # Reusable test cases (shared across profiles)
│   ├── testdata/                   # Test data files
│   ├── common.go                   # Common helper functions
│   ├── chat_completions_request.go
│   ├── domain_classify.go
│   ├── cache.go
│   ├── pii_detection.go
│   └── jailbreak_detection.go
├── profiles/
│   └── ai-gateway/                 # AI Gateway test profile
│       └── profile.go              # Profile definition and environment setup
└── README.md
```
## Available Test Cases

The framework includes the following test cases (all in `e2e/testcases/`):

| Test Case | Description | Metrics |
|-----------|-------------|---------|
| `chat-completions-request` | Basic chat completions API test | Response validation |
| `domain-classify` | Domain classification accuracy | 65 cases, accuracy rate |
| `cache` | Semantic cache hit rate | 5 groups, cache hit rate |
| `pii-detection` | PII detection and blocking | 10 PII types, detection rate, block rate |
| `jailbreak-detection` | Jailbreak attack detection | 10 attack types, detection rate, block rate |

All test cases:

- Use model name `"MoM"` (see the request sketch below)
- Automatically clean up port forwarding
- Generate detailed reports with statistics
- Support verbose logging

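The sketch below illustrates the kind of request a test case sends once port forwarding is in place. The hard-coded `localhost:8080` endpoint is an assumption for illustration only; real test cases obtain the local port from the framework's port-forwarding helpers and the profile's `ServiceConfig`.

```go
// Sketch only: a minimal chat completions request using model "MoM".
// The localhost:8080 endpoint is assumed; test cases derive the real
// local port from setupServiceConnection / ServiceConfig.
package main

import (
    "bytes"
    "fmt"
    "io"
    "net/http"
)

func main() {
    payload := []byte(`{"model": "MoM", "messages": [{"role": "user", "content": "What is 2+2?"}]}`)

    resp, err := http.Post("http://localhost:8080/v1/chat/completions", "application/json", bytes.NewReader(payload))
    if err != nil {
        panic(err)
    }
    defer resp.Body.Close()

    body, _ := io.ReadAll(resp.Body)
    fmt.Println(resp.Status)
    fmt.Println(string(body))
}
```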
## Quick Start

### Install dependencies (optional)
### Run tests

```bash
make e2e-test USE_EXISTING_CLUSTER=true   # reuse an existing cluster
make e2e-test VERBOSE=true                # enable verbose output
```
### Test Reports

After running tests, reports are generated:

- `test-report.json`: Structured test results
- `test-report.md`: Human-readable Markdown report

Each test case also prints detailed statistics to the console.
## Adding New Test Profiles

1. Create a new directory under `profiles/`
2. Implement the `Profile` interface
3. Register test cases
4. Update `cmd/e2e/main.go` to include the new profile

See `profiles/ai-gateway/` for a complete example.
## Key Concepts

### Profile vs Test Case Separation

**Profiles** are responsible for:

- Deploying the test environment (Helm charts, Kubernetes resources)
- Verifying environment health
- Providing service configuration (namespace, labels, port mappings)

**Test Cases** are responsible for:

- Executing test logic
- Validating functionality
- Reporting results

This separation allows test cases to be **reused across different profiles** by simply providing different service configurations.
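In practice, a profile is expected to satisfy an interface along the lines of the sketch below, inferred from the "Adding a New Profile" example later in this README; the option and config types here are placeholders for the ones exported by the framework package.

```go
// Sketch only: the assumed shape of the Profile contract, inferred from the
// profile example later in this README. The types below are placeholders for
// the real ones exported by e2e/pkg/framework.
package sketch

import "context"

type SetupOptions struct{}    // placeholder for framework.SetupOptions
type TeardownOptions struct{} // placeholder for framework.TeardownOptions
type ServiceConfig struct{}   // shown in full under "Service Configuration"

type Profile interface {
    Name() string
    Description() string
    Setup(ctx context.Context, opts *SetupOptions) error
    Teardown(ctx context.Context, opts *TeardownOptions) error
    GetTestCases() []string
    GetServiceConfig() ServiceConfig
}
```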
**Benefits:**

- ✅ Test cases are independent of deployment details
- ✅ Easy to add new profiles without duplicating test logic
- ✅ Profiles can share common test cases
- ✅ Test cases can be maintained in one place
- ✅ Clear separation of concerns

### Service Configuration

Profiles provide service configuration to test cases via `ServiceConfig`:

```go
type ServiceConfig struct {
    LabelSelector string // e.g., "gateway.envoyproxy.io/owning-gateway-namespace=default,..."
    Namespace     string // Service namespace
    Name          string // Service name (optional; if empty, LabelSelector is used)
    PortMapping   string // e.g., "8080:80" (localPort:servicePort)
}
```

Test cases use this configuration to connect to the deployed service without knowing the specific deployment details.
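For illustration, a hypothetical helper for turning the `PortMapping` field into the base URL a test case should call might look like the sketch below (`localBaseURL` is not part of the framework; it only shows how the `"localPort:servicePort"` convention is interpreted).

```go
package testcases

import (
    "fmt"
    "strings"
)

// localBaseURL is a hypothetical helper (not part of the framework): given a
// PortMapping such as "8080:80" (localPort:servicePort), it returns the base
// URL a test case should call, assuming the port forward is already running.
func localBaseURL(portMapping string) (string, error) {
    parts := strings.SplitN(portMapping, ":", 2)
    if len(parts) != 2 || parts[0] == "" {
        return "", fmt.Errorf("invalid port mapping %q", portMapping)
    }
    return "http://localhost:" + parts[0], nil
}
```

With such a helper, a test case only needs the `ServiceConfig` handed to it by the profile.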
## Test Case Registration

Test cases are registered using a simple function-based approach:

```go
func init() {
    pkgtestcases.Register("my-test", pkgtestcases.TestCase{
        Description: "Description of what this test does",
        Tags:        []string{"functional", "llm"},
        Fn:          testMyFeature,
    })
}

func testMyFeature(ctx context.Context, client *kubernetes.Clientset, opts pkgtestcases.TestCaseOptions) error {
    // Set up the connection to the service
    localPort, stopPortForward, err := setupServiceConnection(ctx, client, opts)
    if err != nil {
        return err
    }
    defer stopPortForward() // Always clean up port forwarding

    // Test implementation using localPort
    // ...
    return nil
}
```
## Framework Features

- **Automatic cluster lifecycle management**: Creates and cleans up Kind clusters
- **Docker image building and loading**: Builds images and loads them into Kind
- **Helm deployment automation**: Deploys required Helm charts
- **Automatic port forwarding cleanup**: Each test case cleans up its port forwarding
- **Detailed logging**: Provides comprehensive test output
- **Test reporting**: Generates JSON and Markdown reports
- **Resource cleanup**: Ensures proper cleanup even on failures
## Test Data

Test data is stored in `e2e/testcases/testdata/` as JSON files. Each test case loads its own test data.

**Available Test Data:**

- `domain_classify_cases.json`: 65 test cases across 13 categories
- `cache_cases.json`: 5 groups of similar questions for cache testing
- `pii_detection_cases.json`: 10 PII types (email, phone, SSN, etc.)
- `jailbreak_detection_cases.json`: 10 attack types (prompt injection, DAN, etc.)

**Test Data Format Example:**

```json
{
  "cases": [
    {
      "question": "What is 2+2?",
      "expected_category": "math",
      "expected_reasoning": "Basic arithmetic question"
    }
  ]
}
```
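Loading this data in a test case can be as simple as the sketch below. The type and function names (`domainCase`, `loadDomainCases`) are illustrative assumptions, not the framework's actual API; the shared helpers in `common.go` presumably play this role.

```go
package testcases

import (
    "encoding/json"
    "os"
)

// domainCase mirrors the JSON format shown above. This sketch is illustrative
// only; the real loading helpers live in the e2e/testcases package.
type domainCase struct {
    Question          string `json:"question"`
    ExpectedCategory  string `json:"expected_category"`
    ExpectedReasoning string `json:"expected_reasoning"`
}

func loadDomainCases(path string) ([]domainCase, error) {
    raw, err := os.ReadFile(path)
    if err != nil {
        return nil, err
    }
    var file struct {
        Cases []domainCase `json:"cases"`
    }
    if err := json.Unmarshal(raw, &file); err != nil {
        return nil, err
    }
    return file.Cases, nil
}
```

A test case would then call something like `loadDomainCases("testdata/domain_classify_cases.json")` and iterate over the returned cases.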
## Prerequisites

Before running E2E tests, ensure you have the following tools installed:

- Docker
- Kind
- Helm
- Go
- Make
## Development

### Adding a New Test Case

Test cases are created in the `e2e/testcases/` directory and can be reused across multiple profiles.

**Steps:**

1. Create a new file in `e2e/testcases/` (e.g., `my_feature.go`)
2. Implement the test function with proper cleanup
3. Register it in the `init()` function
4. Add test data to `e2e/testcases/testdata/` if needed
5. Add the test case name to any profile's `GetTestCases()` method

**Example:**
```go
package testcases

import (
    "context"
    "fmt"

    "k8s.io/client-go/kubernetes"
    pkgtestcases "github.com/vllm-project/semantic-router/e2e/pkg/testcases"
)

func init() {
    pkgtestcases.Register("my-feature", pkgtestcases.TestCase{
        Description: "Test my new feature",
        Tags:        []string{"functional", "llm"},
        Fn:          testMyFeature,
    })
}

func testMyFeature(ctx context.Context, client *kubernetes.Clientset, opts pkgtestcases.TestCaseOptions) error {
    if opts.Verbose {
        fmt.Println("[Test] Testing my feature")
    }

    // Set up the service connection and get the local port
    localPort, stopPortForward, err := setupServiceConnection(ctx, client, opts)
    if err != nil {
        return err
    }
    defer stopPortForward() // IMPORTANT: always clean up port forwarding

    // Test implementation using localPort
    url := fmt.Sprintf("http://localhost:%s/v1/chat/completions", localPort)
    // ... send requests and validate responses
    _ = url // keep the snippet compiling until the elided request code is added

    return nil
}
```
**Important Notes:**

- Always use `defer stopPortForward()` to clean up port forwarding
- Use `opts.ServiceConfig` to get service connection details
- Use `opts.Verbose` for debug logging
- Load test data from `e2e/testcases/testdata/`
- Use model name `"MoM"` in all requests

### Adding a New Profile

Profiles define deployment environments and can reuse existing test cases.

**Steps:**

1. Create a new directory under `profiles/` (e.g., `profiles/istio/`)
2. Create `profile.go` implementing the `Profile` interface
3. Implement the required methods:
   - `Setup()`: Deploy the environment
   - `Teardown()`: Clean up resources
   - `GetTestCases()`: Return the list of test case names to run
   - `GetServiceConfig()`: Provide the service configuration
4. Import the `testcases` package to register test cases
5. Update `cmd/e2e/main.go` to include the new profile

**Example:**
```go
package myprofile

import (
    "context"

    "github.com/vllm-project/semantic-router/e2e/pkg/framework"

    // Import testcases to register them
    _ "github.com/vllm-project/semantic-router/e2e/testcases"
)

type Profile struct {
    verbose bool
}

func NewProfile(verbose bool) *Profile {
    return &Profile{verbose: verbose}
}

func (p *Profile) Name() string {
    return "my-profile"
}

func (p *Profile) Description() string {
    return "My custom deployment profile"
}

func (p *Profile) Setup(ctx context.Context, opts *framework.SetupOptions) error {
    // Deploy your environment
    return nil
}

func (p *Profile) Teardown(ctx context.Context, opts *framework.TeardownOptions) error {
    // Clean up resources
    return nil
}

func (p *Profile) GetTestCases() []string {
    return []string{
        "chat-completions-request",
        "domain-classify",
        // ... other test cases
    }
}

func (p *Profile) GetServiceConfig() framework.ServiceConfig {
    return framework.ServiceConfig{
        LabelSelector: "app=my-service",
        Namespace:     "default",
        PortMapping:   "8080:80",
    }
}
```
See `profiles/ai-gateway/` for a complete example.