Update dependencies and refactor OpenAI stream handling#216

Merged
mdugue merged 1 commit into main from update-ai-dependencies
Nov 20, 2025
Conversation

@mdugue (Owner) commented Nov 20, 2025

  • Updated various dependencies in `package.json` and `bun.lock`, including `@react-spring/web`, `ai`, and `openai`.
  • Refactored the OpenAI stream handling in `landing-page-quote.tsx` to use a new `ReadableStream` implementation for better performance and error handling.
  • Adjusted the linting command order in `package.json` for consistency.
@mdugue mdugue requested a review from Copilot November 20, 2025 07:47
@vercel bot commented Nov 20, 2025

The latest updates on your projects. Learn more about Vercel for GitHub.

Project | Deployment | Preview | Comments | Updated (UTC)
manuel-dugue | Ready | Ready · Preview | Comment | Nov 20, 2025 7:48am

@mdugue mdugue merged commit 6bdfcf3 into main Nov 20, 2025
9 checks passed

Copilot AI left a comment


Pull Request Overview

This PR updates several key dependencies and refactors the OpenAI streaming implementation to use native ReadableStream instead of the deprecated OpenAIStream helper from the ai package.

Key Changes

  • Updated ai package from 3.1.12 to 5.0.97 (major version upgrade)
  • Updated openai package from 5.12.2 to 6.9.1 (major version upgrade)
  • Updated @react-spring/web from 9.7.5 to 10.0.3 (major version upgrade)
  • Replaced OpenAIStream helper with custom ReadableStream implementation
  • Removed text extraction regex that handled quoted strings and escape sequences
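The refactor described above can be sketched as follows. This is a minimal, hedged illustration of wrapping an OpenAI chat-completions stream in a native `ReadableStream`, not the exact code from `landing-page-quote.tsx`; the `Chunk` type and `toReadableStream` helper are assumptions that mirror the openai SDK's streaming delta shape.

```typescript
// Minimal sketch (assumed shapes, not the PR's exact code): wrap an async
// iterable of OpenAI-style streaming chunks in a native ReadableStream.
type Delta = { content?: string };
type Chunk = { choices: { delta?: Delta }[] };

function toReadableStream(response: AsyncIterable<Chunk>): ReadableStream<Uint8Array> {
  // Instantiate the encoder once, outside the loop (see review comment below).
  const encoder = new TextEncoder();
  return new ReadableStream<Uint8Array>({
    async start(controller) {
      try {
        for await (const chunk of response) {
          // OpenAI streams can include chunks without content; skip those.
          const content = chunk.choices[0]?.delta?.content;
          if (content) {
            controller.enqueue(encoder.encode(content));
          }
        }
        controller.close();
      } catch (err) {
        // Surface upstream failures to the stream's consumer.
        controller.error(err);
      }
    },
  });
}
```

Errors thrown while iterating the upstream response are forwarded via `controller.error`, which is the main behavioral difference from simply re-throwing inside `start`.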

Reviewed Changes

Copilot reviewed 2 out of 3 changed files in this pull request and generated 4 comments.

File | Description
package.json | Updated dependency versions for ai, openai, @react-spring/web, and @vercel/node; reordered lint commands to run Biome before ESLint
bun.lock | Lockfile updates reflecting the new dependency versions and their transitive dependencies
app/landing-page-quote.tsx | Refactored from the OpenAIStream helper to a custom ReadableStream implementation with manual chunk processing; removed an unused import and the text extraction regex

Comment on lines +90 to +95

    try {
      for await (const chunk of response) {
        const content = chunk.choices[0]?.delta?.content;
        if (content) {
          fullCompletion += content;
          const encoder = new TextEncoder();

Copilot AI Nov 20, 2025


The TextEncoder is instantiated inside the loop for each chunk. This creates unnecessary object allocations during streaming. Move the encoder instantiation outside the loop before the for await statement to improve performance.

Example:

const encoder = new TextEncoder();
for await (const chunk of response) {
  const content = chunk.choices[0]?.delta?.content;
  if (content) {
    fullCompletion += content;
    controller.enqueue(encoder.encode(content));
  }
}
Suggested change

    - try {
    -   for await (const chunk of response) {
    -     const content = chunk.choices[0]?.delta?.content;
    -     if (content) {
    -       fullCompletion += content;
    -       const encoder = new TextEncoder();
    + const encoder = new TextEncoder();
    + try {
    +   for await (const chunk of response) {
    +     const content = chunk.choices[0]?.delta?.content;
    +     if (content) {
    +       fullCompletion += content;

Comment on lines +87 to +89

    let fullCompletion = "";
    const stream = new ReadableStream({
      async start(controller) {

Copilot AI Nov 20, 2025


The variable fullCompletion is declared in the outer scope but mutated inside the async stream handler. This pattern makes the variable's lifecycle and ownership unclear. Consider encapsulating this state within the start function's closure or documenting why it's scoped outside if there's a specific reason (though in this case, it doesn't need to be).

Suggested change

    - let fullCompletion = "";
    - const stream = new ReadableStream({
    -   async start(controller) {
    + const stream = new ReadableStream({
    +   async start(controller) {
    +     let fullCompletion = "";

Comment on lines +92 to +93

    const content = chunk.choices[0]?.delta?.content;
    if (content) {

Copilot AI Nov 20, 2025


The code uses optional chaining (chunk.choices[0]?.delta?.content) but doesn't handle the case where chunk.choices[0] exists but delta or content is null or undefined. This silently skips chunks without content. While this may be intentional (since OpenAI streams can include chunks without content), it would be clearer to add a comment explaining this behavior or to explicitly check choices.length > 0 for better readability.

Suggested change

    - const content = chunk.choices[0]?.delta?.content;
    - if (content) {
    + // OpenAI streams can include chunks without content; skip those.
    + if (
    +   Array.isArray(chunk.choices) &&
    +   chunk.choices.length > 0 &&
    +   chunk.choices[0].delta &&
    +   typeof chunk.choices[0].delta.content === "string"
    + ) {
    +   const content = chunk.choices[0].delta.content;

    return (
      <>
        {extractedText}
        {text}

Copilot AI Nov 20, 2025


The removal of the text extraction logic that previously parsed quoted strings and replaced \n with actual newlines may cause formatting issues. The old code handled escaped newlines from the AI response, but now raw text is displayed directly. If the OpenAI API returns escape sequences like \n as literal strings, they won't be converted to actual line breaks. Verify that the stream format has changed with the new openai package version, or restore the text processing logic if needed.
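If the text processing does need to be restored, it might look like the sketch below. The original regex is not shown in this PR, so this is an assumption about one of the behaviors the reviewer describes: converting literal `\n` escape sequences in the model's response into real newlines.

```typescript
// Hedged sketch (hypothetical helper, not the removed code): turn literal
// backslash-n sequences from a model response into actual line breaks.
function unescapeNewlines(raw: string): string {
  // /\\n/g matches a literal backslash followed by 'n' in the input string.
  return raw.replace(/\\n/g, "\n");
}
```

Whether this step is still needed depends on whether the new openai package version already delivers unescaped content in its streaming deltas, which is exactly what the reviewer asks to verify.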
