35 changes: 21 additions & 14 deletions app/landing-page-quote.tsx
@@ -1,7 +1,6 @@
import { documentToHtmlString } from "@contentful/rich-text-html-renderer";
import { RiOpenaiFill } from "@remixicon/react";
import { kv } from "@vercel/kv";
-import { OpenAIStream } from "ai";
import type { AllInOnePageQuery } from "gql/graphql";
import Link from "next/link";
import OpenAi from "openai";
@@ -84,12 +83,26 @@ Strikte Regeln
],
});

-  // Convert the response into a friendly text-stream
-  // @ts-expect-error TODO: migrate to new OpenAIStream API
-  const stream = OpenAIStream(response, {
-    async onCompletion(completion) {
-      await kv.set(cacheKey, completion);
-      await kv.expire(cacheKey, 60 * 60);
+  // Convert OpenAI stream to ReadableStream and handle completion callback
+  let fullCompletion = "";
+  const stream = new ReadableStream({
+    async start(controller) {
Comment on lines +87 to +89

Copilot AI Nov 20, 2025

The variable fullCompletion is declared in the outer scope but mutated inside the async stream handler. This pattern makes the variable's lifecycle and ownership unclear. Consider encapsulating this state within the start function's closure or documenting why it's scoped outside if there's a specific reason (though in this case, it doesn't need to be).

Suggested change:
-  let fullCompletion = "";
-  const stream = new ReadableStream({
-    async start(controller) {
+  const stream = new ReadableStream({
+    async start(controller) {
+      let fullCompletion = "";
+      try {
+        for await (const chunk of response) {
+          const content = chunk.choices[0]?.delta?.content;
+          if (content) {
Comment on lines +92 to +93

Copilot AI Nov 20, 2025

The code uses optional chaining (chunk.choices[0]?.delta?.content) but doesn't handle the case where chunk.choices[0] exists but delta or content is null or undefined. This silently skips chunks without content. While this may be intentional (since OpenAI streams can include chunks without content), it would be clearer to add a comment explaining this behavior or to explicitly check choices.length > 0 for better readability.

Suggested change:
-          const content = chunk.choices[0]?.delta?.content;
-          if (content) {
+          // OpenAI streams can include chunks without content; skip those.
+          if (
+            Array.isArray(chunk.choices) &&
+            chunk.choices.length > 0 &&
+            chunk.choices[0].delta &&
+            typeof chunk.choices[0].delta.content === "string"
+          ) {
+            const content = chunk.choices[0].delta.content;
+            fullCompletion += content;
+            const encoder = new TextEncoder();
Comment on lines +90 to +95

Copilot AI Nov 20, 2025

The TextEncoder is instantiated inside the loop for each chunk. This creates unnecessary object allocations during streaming. Move the encoder instantiation outside the loop before the for await statement to improve performance.

Example:

const encoder = new TextEncoder();
for await (const chunk of response) {
  const content = chunk.choices[0]?.delta?.content;
  if (content) {
    fullCompletion += content;
    controller.enqueue(encoder.encode(content));
  }
}

Suggested change:
-      try {
-        for await (const chunk of response) {
-          const content = chunk.choices[0]?.delta?.content;
-          if (content) {
-            fullCompletion += content;
-            const encoder = new TextEncoder();
+      const encoder = new TextEncoder();
+      try {
+        for await (const chunk of response) {
+          const content = chunk.choices[0]?.delta?.content;
+          if (content) {
+            fullCompletion += content;
+            controller.enqueue(encoder.encode(content));
+          }
+        }
+        // Cache the full completion when stream ends
+        await kv.set(cacheKey, fullCompletion);
+        await kv.expire(cacheKey, 60 * 60);
+        controller.close();
+      } catch (error) {
+        controller.error(error);
+      }
    },
  });
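Taken together, the three review comments above describe one cleaned-up version of this streaming block: fullCompletion scoped inside start(), the TextEncoder hoisted out of the loop, and the completion callback kept at the end. A minimal runnable sketch of that shape — mockResponse, toStream, readAll, and the Chunk type are illustrative stand-ins for the OpenAI SDK stream and the kv caching calls, not the PR's actual code:

```typescript
// Sketch only: mockResponse stands in for the OpenAI SDK stream,
// and onDone stands in for the kv.set / kv.expire caching calls.
type Chunk = { choices: { delta?: { content?: string } }[] };

async function* mockResponse(): AsyncGenerator<Chunk> {
  yield { choices: [{ delta: { content: "Hello, " } }] };
  yield { choices: [{ delta: {} }] }; // content-less chunks are skipped
  yield { choices: [{ delta: { content: "world" } }] };
}

function toStream(
  response: AsyncIterable<Chunk>,
  onDone: (full: string) => Promise<void> | void
): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder(); // hoisted: one encoder per stream
  return new ReadableStream<Uint8Array>({
    async start(controller) {
      let fullCompletion = ""; // scoped to the stream's own lifecycle
      try {
        for await (const chunk of response) {
          const content = chunk.choices[0]?.delta?.content;
          if (content) {
            fullCompletion += content;
            controller.enqueue(encoder.encode(content));
          }
        }
        await onDone(fullCompletion); // e.g. cache the full completion
        controller.close();
      } catch (error) {
        controller.error(error);
      }
    },
  });
}

async function readAll(stream: ReadableStream<Uint8Array>): Promise<string> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let out = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) return out;
    out += decoder.decode(value, { stream: true });
  }
}
```

One design note: because onDone runs before controller.close(), the cache write is guaranteed to have started before the consumer observes the end of the stream, matching the "cache on completion" behavior of the removed OpenAIStream onCompletion callback.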

@@ -105,7 +118,6 @@ Strikte Regeln
);
}

-const extractTextRegex = /"([^"]*)"/;
async function Reader({
reader,
}: {
@@ -119,15 +131,10 @@ async function Reader({
}

const text = new TextDecoder().decode(value);
-  const extractedText = text.match(extractTextRegex)?.[1]?.replace(
-    /\\n/g,
-    `
-`
-  );

return (
<>
-      {extractedText}
+      {text}
Copilot AI Nov 20, 2025

The removal of the text extraction logic that previously parsed quoted strings and replaced \n with actual newlines may cause formatting issues. The old code handled escaped newlines from the AI response, but now raw text is displayed directly. If the OpenAI API returns escape sequences like \n as literal strings, they won't be converted to actual line breaks. Verify that the stream format has changed with the new openai package version, or restore the text processing logic if needed.
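If the old stream format (a JSON-escaped quoted string) ever resurfaces, the removed extraction step could be kept as a small helper. A sketch under that assumption — extractText is an illustrative name, not a function from the PR:

```typescript
// Sketch: assumes the payload contains a double-quoted fragment whose
// literal "\n" escape sequences should become real line breaks, as in
// the removed code.
const extractTextRegex = /"([^"]*)"/;

function extractText(raw: string): string | undefined {
  return raw.match(extractTextRegex)?.[1]?.replace(/\\n/g, "\n");
}
```

For example, extractText('data: "line one\\nline two"') returns the captured fragment with its escaped newline converted to a real line break, and returns undefined when no quoted fragment is present.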
<Suspense>
<Reader reader={reader} />
</Suspense>