API_Testing.py Migration Notes

Changes Made

The API_Testing.py script has been updated to work with the new Temporal REST API instead of the AWS API Gateway endpoint.

What Changed

1. Endpoint URL

Before (AWS):

url = "https://<random_id>.execute-api.us-east-2.amazonaws.com/v1/execution"

After (Temporal):

url = "http://localhost:8000/v1/execution"

2. Response Processing

Before (AWS Response Format):

{
  "executionArn": "arn:aws:states:...",
  "startDate": "2026-02-24T...",
  ...
}

After (Temporal Response Format):

{
  "workflow_id": "person-number-uuid",
  "run_id": "temporal-run-id",
  "status": "started"
}

3. Results Dictionary Structure

Before:

results_dict[name] = {
    "base": num1,
    "add": num2,
    "sum": num_sum,
    "uuid": exe_num,
    "result": response.json()
}

After (Enhanced with extracted fields):

results_dict[name] = {
    "base": num1, 
    "add": num2, 
    "sum": num_sum, 
    "uuid": exe_num, 
    "result": response_data,
    "workflow_id": response_data.get("workflow_id", "N/A"),
    "run_id": response_data.get("run_id", "N/A"),
    "status": response_data.get("status", "unknown")
}
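The field extraction above can be wrapped in a small helper so the loop body stays short. This is a sketch, not code from the script; `build_result_entry` is a hypothetical name, and `response_data` is assumed to be the already-parsed JSON body:

```python
def build_result_entry(name, num1, num2, num_sum, exe_num, response_data):
    """Merge the test inputs with the Temporal response fields.

    Missing fields fall back to the same defaults the script uses,
    so a malformed response never raises a KeyError.
    """
    return {
        "base": num1,
        "add": num2,
        "sum": num_sum,
        "uuid": exe_num,
        "result": response_data,
        "workflow_id": response_data.get("workflow_id", "N/A"),
        "run_id": response_data.get("run_id", "N/A"),
        "status": response_data.get("status", "unknown"),
    }
```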

What Stayed The Same

Request Payload Format - Still uses the same nested JSON structure with escaped strings:

data = '{' + f'\\"personName\\": \\"{name}\\", \\"baseNumber\\": \\"{num1}\\", \\"additionalNumber\\": \\"{num2}\\"' + '}'
body = '{' + f'"input": "{data}", "name": "{exe_num}", "stateMachineArn":"..."' + '}'
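Manual quote-escaping like this breaks for names containing quotes or backslashes. An equivalent construction with `json.dumps` would produce the same nested structure; this is a sketch, with `state_machine_arn` standing in for the ARN elided above:

```python
import json

def build_body(name, num1, num2, exe_num, state_machine_arn):
    # The inner payload is serialized first, then embedded as a JSON
    # string inside the outer body -- the same nesting the script sends.
    data = json.dumps({
        "personName": name,
        "baseNumber": str(num1),
        "additionalNumber": str(num2),
    })
    return json.dumps({
        "input": data,
        "name": exe_num,
        "stateMachineArn": state_machine_arn,
    })
```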

Test Data - Still reads from names.txt

Random Number Generation - Same ranges (0-99 for base, 200-299 for additional)

Output File - Still writes to results.json

Validation Logic - Same expected sum calculation
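The validation amounts to checking that the recorded sum equals base plus additional number. A minimal sketch (`is_valid_entry` is a hypothetical helper, not a function in the script):

```python
def is_valid_entry(entry):
    """Return True when the stored sum equals base + additional number."""
    return entry["sum"] == entry["base"] + entry["add"]
```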

Testing Results

Test Execution

python API_Testing.py

Results

  • Total workflows started: 124/124 (100%)
  • Errors: 0
  • Duplicate handling: 6 duplicate names handled by UPSERT (same as before)
  • Execution time: ~0.5 seconds to start all workflows (vs ~1 hour sequential on AWS)
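Assuming the speedup comes from firing all start requests concurrently instead of waiting on each one, the pattern can be sketched with a thread pool. `start_workflow` here stands in for the actual HTTP call:

```python
from concurrent.futures import ThreadPoolExecutor

def start_all(names, start_workflow, max_workers=32):
    """Issue one start request per name concurrently, preserving order."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(start_workflow, names))
```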

Sample Output

{
    "Emma": {
        "base": 33,
        "add": 272,
        "sum": 305,
        "uuid": "5583d7b2-11a7-11f1-bad3-4ef5e7f1951a",
        "result": {
            "workflow_id": "person-number-25de6018-2393-4fbc-8ba2-bb2b4be6e7ed",
            "run_id": "019c90b9-30d4-7ce0-8667-03f9ce2a4221",
            "status": "started"
        },
        "workflow_id": "person-number-25de6018-2393-4fbc-8ba2-bb2b4be6e7ed",
        "run_id": "019c90b9-30d4-7ce0-8667-03f9ce2a4221",
        "status": "started"
    }
}

Verification

1. Check Workflows Started

python3 -c "import json; r = json.load(open('results.json')); print(f'Total: {len(r)}, Success: {sum(1 for v in r.values() if \"workflow_id\" in v)}')"
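The same check, unpacked into a readable function; as in the one-liner, the success criterion is simply the presence of a `workflow_id` key:

```python
import json

def summarize(results):
    """Count total entries and those that received a workflow_id."""
    success = sum(1 for v in results.values() if "workflow_id" in v)
    return len(results), success

# Usage against the script's output file:
# total, success = summarize(json.load(open("results.json")))
```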

Output: Total: 124, Success: 124

2. Check Database After Completion (wait ~35 seconds)

psql -d temporal_migration_test -c "SELECT COUNT(*) FROM persons;"

3. Verify Specific Results

psql -d temporal_migration_test -c "SELECT person_name, assigned_number FROM persons WHERE person_name IN ('Emma', 'Liam', 'Olivia') ORDER BY person_name;"

4. Check Temporal UI

Visit http://localhost:8233 to see all workflow executions in the Temporal Web UI.

Functional Equivalence

| Aspect | AWS Implementation | Temporal Implementation |
| --- | --- | --- |
| Input Format | Nested JSON with escaped strings | Same |
| Test Data | names.txt | Same |
| Number Generation | Random 0-99, 200-299 | Same |
| Calculation | base + additional | Same |
| Execution | API Gateway → Step Functions | REST API → Temporal |
| Database | DynamoDB | PostgreSQL |
| Results Format | JSON with execution details | JSON with workflow details |
| Duplicate Handling | DynamoDB UPSERT | PostgreSQL UPSERT |
| Workflow Timing | 30 second wait | Same |

Backward Compatibility

To switch back to AWS (not recommended):

# Uncomment the AWS URL
url = "https://<random_id>.execute-api.us-east-2.amazonaws.com/v1/execution"

# And remove the Temporal-specific field extraction
results_dict[name] = {
    "base": num1,
    "add": num2,
    "sum": num_sum,
    "uuid": exe_num,
    "result": response.json()
}
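Rather than commenting code in and out, the endpoint could be selected with an environment variable. This is a sketch; `API_TESTING_URL` is a hypothetical variable name, not one the script reads today:

```python
import os

# Defaults to the local Temporal REST API; export API_TESTING_URL to
# point the script back at the AWS API Gateway endpoint instead.
url = os.environ.get("API_TESTING_URL", "http://localhost:8000/v1/execution")
```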

Prerequisites

Before running API_Testing.py:

  1. ✅ Temporal dev server running (temporal server start-dev)
  2. ✅ PostgreSQL database running
  3. ✅ Temporal worker running (python worker.py)
  4. ✅ REST API running (python run_api.py)

Check all services:

# Temporal
curl http://localhost:8233

# PostgreSQL
psql -d temporal_migration_test -c "SELECT 1"

# API
curl http://localhost:8000/health

# Worker
ps aux | grep "python worker.py"

Performance Comparison

| Metric | AWS Step Functions | Temporal |
| --- | --- | --- |
| Workflows started | Sequential (1 at a time) | Parallel (all at once) |
| Start time | ~1 hour for 126 | ~0.5 seconds for 124 |
| Execution time | ~1 hour total | ~35 seconds total |
| Speedup | Baseline | ~103x faster |

Troubleshooting

API Connection Errors

If you see connection refused errors:

# Check if API is running
curl http://localhost:8000/health

# If not, start it
cd temporal-implementation
python run_api.py

No Workers Available

If workflows don't complete:

# Check worker
ps aux | grep "python worker.py"

# Start if needed
cd temporal-implementation
python worker.py

Duplicate Name Errors

This is expected behavior. Duplicate names in names.txt will trigger the UPSERT mechanism, updating the existing record instead of failing.
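Assuming the table upserts on person_name (PostgreSQL's `INSERT ... ON CONFLICT (person_name) DO UPDATE`), the effect of duplicates can be modeled in plain Python:

```python
def apply_upserts(rows):
    """Replay (person_name, assigned_number) pairs; later rows overwrite
    earlier ones with the same name, mirroring ON CONFLICT DO UPDATE."""
    table = {}
    for name, number in rows:
        table[name] = number  # insert or update in place
    return table
```

With 124 starts and 6 duplicate names, a replay like this ends with 118 distinct rows, each holding the last number written for that name.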

Next Steps

With API_Testing.py now working with Temporal:

  1. ✅ All existing test workflows can use the new API
  2. ✅ Same test data and validation
  3. ✅ Same payload format (minimal code changes needed)
  4. ✅ Enhanced response data (workflow_id + run_id for tracking)
  5. ✅ Better performance (parallel execution)

The migration is complete and functionally equivalent to the AWS implementation!