ReadableStream not working in production, but is working locally (serverless functions keep-alive issue?) #77704
I'm working with the code below.

Client side:

```ts
useEffect(() => {
  const fetchStream = async () => {
    const response = await fetch("/api/someRoute", {
      method: "POST",
      cache: "no-cache",
      keepalive: true,
      headers: {
        "Content-Type": "text/plain; charset=utf-8",
      },
      body: JSON.stringify({
        // ...
      }),
    });
    const reader = response.body?.getReader();
    if (reader) {
      while (true) {
        const { done, value } = await reader.read();
        const jsonValue = new TextDecoder().decode(value);
        console.log({ done, jsonValue });
        if (jsonValue.length) {
          const j = JSON.parse(jsonValue);
          setJson(j);
        }
        if (done) {
          console.log("STREAM DONE");
          // by now, `json` is filled with valid data
          break;
        }
      }
    }
  };
  fetchStream();
}, []);
```

Server side:

```ts
export async function POST(request: NextRequest) {
  const body: RequestBody = await request.json();
  console.log({ body });
  const prompt = getPrompt(body);
  console.log({ prompt });

  const responseHeaders = new Headers();
  responseHeaders.set("Cache-Control", "no-cache");
  responseHeaders.set("Connection", "keep-alive");
  responseHeaders.set("Content-Type", "text/plain; charset=utf-8");

  const { partialObjectStream } = await streamObject({
    model: google("gemini-2.0-flash-lite-preview-02-05", {}),
    schema,
    schemaName: "...",
    prompt,
    mode: "json",
  });

  const encoder = new TextEncoder();
  const stream = new ReadableStream({
    async start(controller) {
      for await (const partialObject of partialObjectStream) {
        console.log(partialObject);
        const chunk = encoder.encode(JSON.stringify(partialObject));
        controller.enqueue(chunk);
      }
      controller.close();
    },
  });

  return new NextResponse(stream, {
    headers: responseHeaders,
  });
}
```

I'm wondering if there's a header or setting I'm missing, or if something like this isn't possible on Vercel.
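Separate from the production issue: `JSON.parse` on each raw read is fragile, because `fetch()` chunk boundaries are arbitrary and a single `read()` can contain a partial JSON object (or split a multi-byte character). A common workaround, sketched below under the assumption that the server appends `"\n"` after each `JSON.stringify(...)` chunk, is newline-delimited JSON with client-side buffering; `parseNdjsonChunks` is a hypothetical helper name, not an API from the post.

```typescript
// Sketch (not from the original post): buffer streamed bytes and only
// JSON.parse complete newline-terminated lines. The TextDecoder is created
// once and called with { stream: true } so multi-byte characters that are
// split across chunks decode correctly.
function parseNdjsonChunks(): {
  push: (bytes: Uint8Array | undefined) => unknown[];
} {
  const decoder = new TextDecoder();
  let buffer = "";
  return {
    push(bytes) {
      // When the stream ends (bytes undefined), flush the decoder's state.
      buffer += bytes
        ? decoder.decode(bytes, { stream: true })
        : decoder.decode();
      const lines = buffer.split("\n");
      buffer = lines.pop() ?? ""; // keep the trailing partial line buffered
      return lines.filter((l) => l.length > 0).map((l) => JSON.parse(l));
    },
  };
}
```

In the `while (true)` loop, each `value` would go through `push(value)`, and only the returned complete objects would be passed to `setJson`.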
Replies: 1 comment, 2 replies
I guess it also works fine when you build locally and start the server? Do you see any data client side, or any of your logs? I wonder if Fluid Compute would help here, or just increasing the `maxDuration` of the functions.
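For context on the `maxDuration` suggestion: if the model stream takes longer than the default serverless timeout, Vercel can terminate the function mid-stream. In the App Router this can be raised per route with the `maxDuration` route segment config exported from the route file; the value below is illustrative, and the allowed maximum depends on your Vercel plan.

```typescript
// app/api/someRoute/route.ts
// Route segment config: allow this function to run for up to 60 seconds.
// 60 is an illustrative value, not taken from the original post.
export const maxDuration = 60;
```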
Ah! Spent several hours on this issue and then remembered the Google Generative AI provider uses the `GOOGLE_GENERATIVE_AI_API_KEY` env variable, which I had locally but hadn't added to my Vercel project in prod 😭
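To surface this class of error earlier (a missing key in prod produced a silently empty stream here), one option is to fail fast at module load. `requireEnv` below is a hypothetical helper, not part of the AI SDK:

```typescript
// Sketch: throw a clear error at startup instead of streaming nothing.
// requireEnv is a hypothetical helper name.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// e.g. at the top of the route file:
// const apiKey = requireEnv("GOOGLE_GENERATIVE_AI_API_KEY");
```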