Commit 48410de

NiallJoeMaher and claude committed
docs: fix all chapter documentation to match actual code

- CHAPTER-0: Updated to show createUIMessageStream pattern, full useChat config with transport, correct systemPrompt signature
- CHAPTER-2: Fixed agent types (UIMessageStreamWriter), gateway.languageModel pattern, tutor params (depth/context), route handler structure
- CHAPTER-3: Rewrote quiz-master and planner to show artifact creation with dataStream.write(), correct models (artifact-model), DB save, error handling
- CHAPTER-4: Fixed CustomUIDataTypes, added focusAreas param, artifact-model
- CHAPTER-5: Added analyst.ts to file structure, fixed inputSchema usage, updated architecture diagrams, added analyst to orchestrator tools

All code snippets are now copy-paste ready and match the actual implementation.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <[email protected]>
1 parent b768631 commit 48410de

File tree

5 files changed: +695 -389 lines changed

CHAPTER-0.md

Lines changed: 111 additions & 45 deletions
````diff
@@ -88,33 +88,53 @@ The heart of the application is `/app/(chat)/api/chat/route.ts`. This is where m
 <summary>📄 <strong>Code: Basic Chat API Route</strong> (click to expand)</summary>

 ```typescript
-// app/(chat)/api/chat/route.ts (simplified)
-import { streamText } from "ai";
+// app/(chat)/api/chat/route.ts (core streaming logic)
+import {
+  convertToModelMessages,
+  createUIMessageStream,
+  JsonToSseTransformStream,
+  smoothStream,
+  streamText,
+} from "ai";
+import { type RequestHints, systemPrompt } from "@/lib/ai/prompts";
 import { myProvider } from "@/lib/ai/providers";
-import { systemPrompt } from "@/lib/ai/prompts";

-export async function POST(request: Request) {
-  const { messages } = await request.json();
-
-  // Stream the AI response
-  const result = streamText({
-    model: myProvider.languageModel("chat-model"),
-    system: systemPrompt(),
-    messages,
-  });
-
-  return result.toDataStreamResponse();
-}
+// Inside POST handler, after authentication and message loading:
+const stream = createUIMessageStream({
+  execute: ({ writer: dataStream }) => {
+    const result = streamText({
+      model: myProvider.languageModel(selectedChatModel),
+      system: systemPrompt({ selectedChatModel, requestHints }),
+      messages: convertToModelMessages(uiMessages),
+      experimental_transform: smoothStream({ chunking: "word" }),
+    });
+
+    result.consumeStream();
+
+    dataStream.merge(
+      result.toUIMessageStream({
+        sendReasoning: true,
+      })
+    );
+  },
+  generateId: generateUUID,
+  onFinish: async ({ messages }) => {
+    // Save messages to database
+  },
+});
+
+return new Response(stream.pipeThrough(new JsonToSseTransformStream()));
 ```

 </details>

 ### Key Concepts

-1. **`streamText`**: The AI SDK function that sends messages to the model and streams the response token by token.
-2. **`myProvider`**: Our configured AI provider (Claude Haiku via AI Gateway).
-3. **`systemPrompt`**: Instructions that tell the AI how to behave.
-4. **`toDataStreamResponse`**: Converts the stream into a format the frontend can consume.
+1. **`createUIMessageStream`**: Creates a stream that handles UI message updates with proper typing.
+2. **`streamText`**: The AI SDK function that sends messages to the model and streams the response.
+3. **`myProvider`**: Our configured AI provider (Claude Haiku via AI Gateway).
+4. **`systemPrompt`**: Function that builds instructions for the AI (takes model and location hints).
+5. **`JsonToSseTransformStream`**: Converts the stream into Server-Sent Events format for the frontend.

 ## How Streaming Works
````

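The `JsonToSseTransformStream` step in the hunk above is essentially Server-Sent Events framing. As a rough, self-contained sketch of that idea (this is not the AI SDK's actual implementation — `toSseFrame` and `jsonToSse` are illustrative names), each JSON chunk is serialized into a `data: …` frame:

```typescript
// Sketch of the JSON-to-SSE idea: each JSON chunk becomes one
// Server-Sent Events frame of the form "data: <json>\n\n".
function toSseFrame(chunk: unknown): string {
  return `data: ${JSON.stringify(chunk)}\n\n`;
}

// A web-standard TransformStream applying that framing to every chunk
// (TransformStream is a global in modern browsers and Node 18+).
function jsonToSse(): TransformStream<unknown, string> {
  return new TransformStream({
    transform(chunk, controller) {
      controller.enqueue(toSseFrame(chunk));
    },
  });
}
```

On the receiving end, the browser splits on the blank line between frames and JSON-parses each `data:` payload back into a message chunk.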
````diff
@@ -138,31 +158,51 @@ When you send a message:

 ## The Frontend Chat Hook

-The frontend uses `useChat` from the AI SDK React package:
+The frontend uses `useChat` from the AI SDK React package with a custom transport configuration:

 <details>
 <summary>📄 <strong>Code: useChat Hook Usage</strong> (click to expand)</summary>

 ```typescript
-// Simplified usage in a chat component
+// components/chat.tsx (key parts)
 import { useChat } from "@ai-sdk/react";
+import { DefaultChatTransport } from "ai";
+
+export function Chat({ id, initialMessages, selectedChatModel }) {
+  const {
+    messages,
+    setMessages,
+    sendMessage,
+    status,
+    stop,
+    regenerate,
+    resumeStream,
+  } = useChat<ChatMessage>({
+    id,
+    messages: initialMessages,
+    experimental_throttle: 100,
+    generateId: generateUUID,
+    transport: new DefaultChatTransport({
+      api: "/api/chat",
+      fetch: fetchWithErrorHandlers,
+      prepareSendMessagesRequest(request) {
+        return {
+          ...request,
+          body: {
+            id,
+            message: request.messages[request.messages.length - 1],
+            selectedChatModel,
+            selectedVisibilityType: visibilityType,
+          },
+        };
+      },
+    }),
+    onFinish: () => {
+      mutate("/api/history");
+    },
+  });

-export function Chat() {
-  const { messages, input, handleSubmit, handleInputChange } = useChat();
-
-  return (
-    <div>
-      {messages.map((message) => (
-        <div key={message.id}>
-          <strong>{message.role}:</strong> {message.content}
-        </div>
-      ))}
-      <form onSubmit={handleSubmit}>
-        <input value={input} onChange={handleInputChange} />
-        <button type="submit">Send</button>
-      </form>
-    </div>
-  );
+  // ... component JSX
 }
 ```
````

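The notable part of the transport config above is that `prepareSendMessagesRequest` posts only the newest message — the server reloads the earlier history from the database. A standalone sketch of that body-shaping logic, with illustrative names (`buildSendBody` and `OutgoingMessage` are not part of the SDK):

```typescript
// Sketch of the request-shaping idea from prepareSendMessagesRequest:
// send the chat id plus only the last message, not the full history.
type OutgoingMessage = { id: string; role: string; parts: unknown[] };

function buildSendBody(
  chatId: string,
  messages: OutgoingMessage[],
  selectedChatModel: string,
  selectedVisibilityType: string
) {
  return {
    id: chatId,
    // Only the newest message — the server already has the earlier ones.
    message: messages[messages.length - 1],
    selectedChatModel,
    selectedVisibilityType,
  };
}
```

Keeping the request body small this way avoids re-uploading the entire conversation on every turn.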
````diff
@@ -171,10 +211,10 @@ export function Chat() {
 > 💡 **React Parallel**: `useChat` is like combining `useState` for messages, `useReducer` for state transitions, and `useSWR` for the API call - all in one hook.

 The `useChat` hook handles:
-- Managing message history
-- Sending messages to the API
-- Streaming response updates
-- Input state management
+- Managing message history with proper typing
+- Sending messages via custom transport
+- Streaming response updates with throttling
+- Request/response transformation

 ## Message Format
````

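The `experimental_throttle: 100` option mentioned above coalesces rapid stream updates so React re-renders at most roughly once per 100 ms. A generic leading-plus-trailing throttle (illustrative only, not the SDK's internal code) looks roughly like:

```typescript
// Generic throttle: invoke fn at most once per `ms`, remembering the most
// recent value so a trailing call still delivers the final update.
function throttle<T>(fn: (value: T) => void, ms: number): (value: T) => void {
  let last = 0;
  let pending: T | undefined;
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (value: T) => {
    const now = Date.now();
    if (now - last >= ms) {
      last = now;
      fn(value); // leading call: fire immediately
    } else {
      pending = value; // keep only the latest value for the trailing call
      if (timer === undefined) {
        timer = setTimeout(() => {
          timer = undefined;
          last = Date.now();
          if (pending !== undefined) fn(pending);
          pending = undefined;
        }, ms - (now - last));
      }
    }
  };
}
```

For token-by-token streaming, the trailing call matters: the UI always settles on the last chunk received even if intermediate chunks were skipped.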
````diff
@@ -192,17 +232,43 @@ type Message = {

 ## The System Prompt

-The system prompt shapes the AI's personality and behavior:
+The system prompt shapes the AI's personality and behavior. It takes the selected model and geolocation hints as parameters:

 <details>
 <summary>📄 <strong>Code: System Prompt</strong> (click to expand)</summary>

 ```typescript
 // lib/ai/prompts.ts
-export const systemPrompt = () => `
-You are a helpful AI assistant. Be concise and helpful.
-Today's date is ${new Date().toLocaleDateString()}.
+import type { Geo } from "@vercel/functions";
+
+export const regularPrompt =
+  "You are a friendly study buddy assistant! Keep your responses concise and helpful.";
+
+export type RequestHints = {
+  latitude: Geo["latitude"];
+  longitude: Geo["longitude"];
+  city: Geo["city"];
+  country: Geo["country"];
+};
+
+export const getRequestPromptFromHints = (requestHints: RequestHints) => `\
+About the origin of user's request:
+- lat: ${requestHints.latitude}
+- lon: ${requestHints.longitude}
+- city: ${requestHints.city}
+- country: ${requestHints.country}
 `;
+
+export const systemPrompt = ({
+  selectedChatModel,
+  requestHints,
+}: {
+  selectedChatModel: string;
+  requestHints: RequestHints;
+}) => {
+  const requestPrompt = getRequestPromptFromHints(requestHints);
+  return `${regularPrompt}\n\n${requestPrompt}`;
+};
 ```

 </details>
````
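To see what the composed prompt actually looks like, here is a simplified, self-contained version of the composition in that hunk — the `Geo`-typed hints are replaced with plain optional strings so it runs outside the app:

```typescript
// Simplified, runnable version of the prompt composition shown above.
// Hints are plain optional strings here; the real code types them via
// Geo from @vercel/functions.
const regularPrompt =
  "You are a friendly study buddy assistant! Keep your responses concise and helpful.";

type Hints = { latitude?: string; longitude?: string; city?: string; country?: string };

const getRequestPromptFromHints = (h: Hints) => `\
About the origin of user's request:
- lat: ${h.latitude}
- lon: ${h.longitude}
- city: ${h.city}
- country: ${h.country}
`;

const systemPrompt = ({ requestHints }: { requestHints: Hints }) =>
  `${regularPrompt}\n\n${getRequestPromptFromHints(requestHints)}`;

// Compose: persona line, blank line, then the location block.
const prompt = systemPrompt({
  requestHints: { latitude: "52.52", longitude: "13.40", city: "Berlin", country: "Germany" },
});
```

The composed string is the persona line, a blank line, then the four-line location block — which is exactly what `streamText` receives as its `system` parameter in the route handler.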
