Description
Expected behavior
The local test server is started when you swift run your Lambda function on your development machine. By default, the server accepts POST /invoke HTTP requests on TCP port 7000. When an invocation is received, it is enqueued to be served to the Lambda Runtime on the next GET /next request.
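For reference, a single local invocation can also be sent programmatically. This is a minimal Swift sketch, equivalent to one of the curl commands in the steps to reproduce below; it assumes the server is running with its default address 127.0.0.1:7000:

import Foundation
#if canImport(FoundationNetworking)
import FoundationNetworking
#endif

// Post one event to the local test server and print the function's response.
// The payload is the same empty JSON string used in the curl reproduction steps.
var request = URLRequest(url: URL(string: "http://127.0.0.1:7000/invoke")!)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")
request.httpBody = Data("\"\"".utf8)

let (body, response) = try await URLSession.shared.data(for: request)
print((response as? HTTPURLResponse)?.statusCode ?? 0)
print(String(decoding: body, as: UTF8.self))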
The local server is currently designed to receive one POST request at a time. It crashes with a fatalError when a POST /invoke is received while the Lambda Runtime is still processing the previous invocation.
https://github.com/swift-server/swift-aws-lambda-runtime/blob/22f9f6d5e71db9067b4f9f8d16a16a6e2b8bb81c/Sources/AWSLambdaRuntime/Lambda%2BLocalServer.swift#L600
The LocalServer should gracefully deny the request or queue it for later delivery to the runtime.
Without further investigation, the LocalServer SHOULD NOT deliver the invocation to the runtime while the runtime is processing the previous invocation. The Lambda runtime environment on AWS doesn't behave like that (concurrent invocations of a Lambda function are handled by different microVMs), and we have never tested the Lambda Runtime for Swift with concurrent execution in mind.
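For illustration, here is a minimal sketch of the queuing approach. It is not the actual LambdaHTTPServer.Pool implementation; the InvocationPool type and its members are hypothetical:

// Buffers invocations received on POST /invoke and hands them to the
// runtime one at a time when it polls GET /next, instead of crashing.
actor InvocationPool {
    private var buffer: [String] = []                        // pending invocation payloads
    private var waiter: CheckedContinuation<String, Never>?  // runtime suspended in next(), if any

    // Called for each POST /invoke: hand the payload to a waiting
    // runtime, or queue it for later delivery.
    func push(_ invocation: String) {
        if let waiter {
            self.waiter = nil
            waiter.resume(returning: invocation)
        } else {
            buffer.append(invocation)
        }
    }

    // Called when the runtime polls GET /next: serve a buffered
    // invocation first, otherwise suspend until one arrives.
    // A real implementation would still need to guard against two
    // concurrent next() calls (the case that currently fatalErrors);
    // this sketch assumes a single runtime consumer.
    func next() async -> String {
        if !buffer.isEmpty {
            return buffer.removeFirst()
        }
        return await withCheckedContinuation { continuation in
            self.waiter = continuation
        }
    }
}

The same state could instead be used to reply with an HTTP 429 to a POST /invoke that arrives while an invocation is in flight, which corresponds to the "gracefully deny" option above.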
Actual behavior
2025-10-14T09:41:25+0200 info LambdaRuntime : host="127.0.0.1" port=7000 [AWSLambdaRuntime] Server started and listening
AWSLambdaRuntime/Lambda+LocalServer.swift:600: Fatal error: Concurrent invocations to next(). This is illegal.
💣 Program crashed: System trap at 0x0000000193aaa22c
Platform: arm64 macOS 26.0.1 (25A362)
Thread 11 crashed:
0 closure #1 in closure #1 in closure #1 in LambdaHTTPServer.Pool.next() + 928 in StreamingNumbers at /Users/stormacq/Documents/amazon/code/swift/lambda/swift-aws-lambda-runtime/Examples/Streaming/.build/checkouts/swift-aws-lambda-runtime/Sources/AWSLambdaRuntime/Lambda+LocalServer.swift:600:29
598│
599│ case .continuation:
600│ fatalError("Concurrent invocations to next(). This is illegal.")
│ ▲
601│ }
602│ }
Steps to reproduce
Start a long-running Lambda function:
git clone https://github.com/swift-server/swift-aws-lambda-runtime
cd swift-aws-lambda-runtime/Examples/Streaming
swift run
From another terminal, send multiple POST /invoke requests at the same time (the second one before the first one finishes processing):
curl -v --output test1.txt --header "Content-Type: application/json" --data '""' http://127.0.0.1:7000/invoke &
curl -v --output test2.txt --header "Content-Type: application/json" --data '""' http://127.0.0.1:7000/invoke &
If possible, minimal yet complete reproducer code (or URL to code)
n/a
What version of this project (swift-aws-lambda-runtime) are you using?
2.0.0
Swift version
6.2
Amazon Linux 2 docker image version
n/a