
Commit d6e29b0

Apply suggestions from code review
Co-authored-by: db-cloudflare <[email protected]>
1 parent 55852e4 commit d6e29b0

File tree

1 file changed: +56 −58 lines changed


src/content/docs/workers-ai/tutorials/build-ai-interview-practice-tool.mdx

Lines changed: 56 additions & 58 deletions
@@ -16,7 +16,7 @@ tags:
 
 import { Render, PackageManagers } from "~/components";
 
-Job interviews can be stressful, and practice is key to building confidence. While traditional mock interviews with friends or mentors are valuable, they aren't always available when you need them. In this tutorial, you'll learn how to build an AI-powered Interview Practice Tool that provides real-time feedback and helps improve interview skills.
+Job interviews can be stressful, and practice is key to building confidence. While traditional mock interviews with friends or mentors are valuable, they are not always available when you need them. In this tutorial, you will learn how to build an AI-powered interview practice tool that provides real-time feedback to help improve interview skills.
 
 By the end of this tutorial, you will have built a complete interview practice tool with the following core functionalities:
 
@@ -36,7 +36,7 @@ This tutorial demonstrates how to use multiple Cloudflare products and while man
 
 ## 1. Create a new Worker project
 
-As the first step, create a Cloudflare Workers project using the Create Cloudflare CLI (C3) tool and the Hono framework.
+Create a Cloudflare Workers project using the Create Cloudflare CLI (C3) tool and the Hono framework.
 
 :::note
 [Hono](https://hono.dev) is a lightweight web framework that helps build API endpoints and handle HTTP requests. This tutorial uses Hono to create and manage the application's routing and middleware components.
@@ -76,9 +76,9 @@ npx wrangler dev
 When you run `wrangler dev`, the command starts a local development server and provides a `localhost` URL where you can preview your application.
 You can now make changes to your code and see them reflected in real-time at the provided localhost address.
 
-### Define TypeScript types for the interview system
+## 2. Define TypeScript types for the interview system
 
-Now that the project is set up, let's create the TypeScript types that will form the foundation of the interview system. These types will help you maintain type safety and provide clear interfaces for the different components of your application.
+Now that the project is set up, create the TypeScript types that will form the foundation of the interview system. These types will help you maintain type safety and provide clear interfaces for the different components of your application.
 
 Create a new file `types.ts` that will contain essential types and enums for:
 
@@ -102,7 +102,7 @@ export interface ApiContext {
 export type HonoCtx = Context<ApiContext>;
 
 // List of technical skills you can assess during mock interviews.
-// Let's focus on popular web technologies and programming languages
+// This application focuses on popular web technologies and programming languages
 // that are commonly tested in real interviews.
 export enum InterviewSkill {
 	JavaScript = "JavaScript",
@@ -168,7 +168,7 @@ export interface InterviewInput {
 }
 ```
 
-### Configure Error Types for Different Services
+## 3. Configure error types for different services
 
 Next, set up custom error types to handle different kinds of errors that may occur in your application. This includes:
 
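The error module itself is largely elided from this diff; based on the `InterviewError` and `ErrorCodes` imports that appear later, a minimal sketch of such a module (the specific code names here are assumptions) might be:

```typescript
// Assumed machine-readable error codes; the real file may define more.
export enum ErrorCodes {
	INVALID_MESSAGE_FORMAT = "INVALID_MESSAGE_FORMAT",
	TRANSCRIPTION_FAILED = "TRANSCRIPTION_FAILED",
	LLM_FAILED = "LLM_FAILED",
	DATABASE_ERROR = "DATABASE_ERROR",
}

// Custom error carrying a code and an HTTP status alongside the message,
// so route handlers can map failures to consistent API responses.
export class InterviewError extends Error {
	constructor(
		message: string,
		public code: ErrorCodes,
		public statusCode: number = 500,
	) {
		super(message);
		this.name = "InterviewError";
	}
}
```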
@@ -226,17 +226,17 @@ export class InterviewError extends Error {
 }
 ```
 
-## 2. Configure Authentication Middleware and User Route
+## 4. Configure authentication middleware and user routes
 
-First, you will implement a basic authentication system to track and identify users interacting with your AI interview practice tool. The system uses HTTP-only cookies to store usernames, allowing you to identify both the request sender and their corresponding Durable Object. This straightforward authentication approach requires users to provide a username, which is then stored securely in a cookie. This approach allows you to:
+In this step, you will implement a basic authentication system to track and identify users interacting with your AI interview practice tool. The system uses HTTP-only cookies to store usernames, allowing you to identify both the request sender and their corresponding Durable Object. This straightforward authentication approach requires users to provide a username, which is then stored securely in a cookie. This approach allows you to:
 
 - Identify users across requests
 - Associate interview sessions with specific users
 - Secure access to interview-related endpoints
 
 ### Create the Authentication Middleware
 
-First, create a middleware function that will check for the presence of a valid authentication cookie. This middleware will be used to protect routes that require authentication.
+Create a middleware function that will check for the presence of a valid authentication cookie. This middleware will be used to protect routes that require authentication.
 
 Create a new middleware file `middleware/auth.ts`:
 
@@ -328,7 +328,7 @@ const app = new Hono<ApiContext>();
 const api = new Hono<ApiContext>();
 
 // Set up global middleware that runs on every request
-// - Logger gives us visibility into what's happening
+// - Logger gives us visibility into what is happening
 app.use("*", logger());
 app.use("*", requireAuth);
 
@@ -337,7 +337,7 @@ app.use("*", requireAuth);
 api.route("/auth", configureAuthRoutes());
 
 // Mount all API routes under the version prefix (for example, /api/v1)
-// This lets us make breaking changes in v2 without affecting v1 users
+// This allows us to make breaking changes in v2 without affecting v1 users
 app.route("/api/v1", api);
 
 export default app;
@@ -349,17 +349,17 @@ Now we have a basic authentication system that:
 2. Securely stores the username in a cookie
 3. Includes middleware to protect authenticated routes
 
-## 3. Create a Durable Object to Manage Interview
+## 5. Create a Durable Object to manage interviews
 
-Now that you have your authentication system in place, create a Durable Object to manage interview sessions. Durable Objects are perfect for the interview practice tool because they provide:
+Now that you have your authentication system in place, create a Durable Object to manage interview sessions. Durable Objects are perfect for this interview practice tool because they provide the following functionalities:
 
-- Maintains state between connections, so users can reconnect without losing progress
-- Comes with a powerful SQLite database to store all interview Q&A, feedback and metrics
-- Enables smooth real-time interactions between the interviewer AI and candidate
-- Handles multiple interview sessions efficiently without performance issues
-- Creates a dedicated instance for each user, giving them their own isolated environment
+- Maintains state between connections, so users can reconnect without losing progress.
+- Provides a SQLite database to store all interview Q&A, feedback and metrics.
+- Enables smooth real-time interactions between the interviewer AI and candidate.
+- Handles multiple interview sessions efficiently without performance issues.
+- Creates a dedicated instance for each user, giving them their own isolated environment.
 
-First, let's configure the Durable Object in `wrangler.toml` file. Add the following configuration:
+First, you will need to configure the Durable Object in the `wrangler.toml` file. Add the following configuration:
 
 ```toml title="wrangler.toml"
 [[durable_objects.bindings]]
@@ -383,7 +383,7 @@ export class Interview extends DurableObject<CloudflareBindings> {
 	constructor(state: DurableObjectState, env: CloudflareBindings) {
 		super(state, env);
 
-		// Initialize empty sessions map - we'll add WebSocket connections as users join
+		// Initialize empty sessions map - we will add WebSocket connections as users join
 		this.sessions = new Map();
 	}
 
@@ -429,14 +429,14 @@ Since the Worker code is written in TypeScript, you should run the following com
 npm run cf-typegen
 ```
 
-### Set up SQLite Database Schema
+### Set up SQLite database schema to store interview data
 
-Let's use SQLite at the Durable Object level for data persistence. This gives each user their own isolated database instance. You'll need two main tables:
+Now you will use SQLite at the Durable Object level for data persistence. This gives each user their own isolated database instance. You will need two main tables:
 
 - `interviews`: Stores interview session data
 - `messages`: Stores all messages exchanged during interviews
 
-But before you create these tables, create a service class to handle your database operations. This encapsulates database logic and helps you:
+Before you create these tables, create a service class to handle your database operations. This encapsulates database logic and helps you:
 
 - Manage database schema changes
 - Handle errors consistently
@@ -470,7 +470,7 @@ export class InterviewDatabaseService {
 	constructor(private sql: SqlStorage) {}
 
 	/**
-	 * Sets up the database schema by creating tables and indexes if they don't exist.
+	 * Sets up the database schema by creating tables and indexes if they do not exist.
 	 * This is called when initializing a new Durable Object instance to ensure
 	 * we have the required database structure.
 	 *
@@ -486,7 +486,7 @@ export class InterviewDatabaseService {
 		const existingTables = new Set([...cursor].map((table) => table.name));
 
 		// The interviews table is our main table storing interview sessions.
-		// We only create it if it doesn't exist yet.
+		// We only create it if it does not exist yet.
 		if (!existingTables.has(CONFIG.database.tables.interviews)) {
 			this.sql.exec(InterviewDatabaseService.QUERIES.CREATE_INTERVIEWS_TABLE);
 		}
@@ -498,7 +498,7 @@ export class InterviewDatabaseService {
 		}
 
 		// Add an index on interviewId to speed up message retrieval.
-		// This is important since we'll frequently query messages by interview.
+		// This is important since we will frequently query messages by interview.
 		this.sql.exec(InterviewDatabaseService.QUERIES.CREATE_MESSAGE_INDEX);
 	} catch (error: unknown) {
 		const message = error instanceof Error ? error.message : String(error);
@@ -537,7 +537,7 @@ export class InterviewDatabaseService {
 }
 ```
 
-Update your Interview Durable Object to use the database service by modifying `src/interview.ts`:
+Update the `Interview` Durable Object to use the database service by modifying `src/interview.ts`:
 
 ```typescript title="src/interview.ts"
 import { InterviewDatabaseService } from "./services/InterviewDatabaseService";
@@ -782,7 +782,7 @@ export class InterviewDatabaseService {
 }
 ```
 
-Add RPC methods to Interview Durable Object to expose database operations through API. Add this code to `src/interview.ts`:
+Add RPC methods to the `Interview` Durable Object to expose database operations through the API. Add this code to `src/interview.ts`:
 
 ```typescript title="src/interview.ts"
 import {
@@ -827,9 +827,9 @@ export class Interview extends DurableObject<CloudflareBindings> {
 }
 ```
 
-### Create REST API Endpoints
+## 6. Create REST API endpoints
 
-With your Durable Object and database service ready, create REST API endpoints to manage interviews. You'll need endpoints to:
+With your Durable Object and database service ready, create REST API endpoints to manage interviews. You will need endpoints to:
 
 - Create new interviews
 - Retrieve all interviews for a user
@@ -920,9 +920,7 @@ export const configureInterviewRoutes = () => {
 };
 ```
 
-:::note
 The `getInterviewDO` helper function uses the username from our authentication cookie to create a unique Durable Object ID. This ensures each user has their own isolated interview state.
-:::
 
 Update your main application file to include the routes and protect them with authentication middleware. Update `src/index.ts`:
 
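The `getInterviewDO` helper itself is elided from this diff. Based on the description above, the username-to-instance mapping can be sketched generically like this (the namespace typing is a local stand-in for the Workers `DurableObjectNamespace` binding):

```typescript
// Minimal local typing for the Durable Object namespace used below.
interface DurableObjectNamespaceLike<T> {
	idFromName(name: string): unknown;
	get(id: unknown): T;
}

// Map an authenticated username to that user's dedicated Durable Object
// stub. idFromName is deterministic, so the same username always
// resolves to the same isolated instance.
function getInterviewDO<T>(ns: DurableObjectNamespaceLike<T>, username: string): T {
	return ns.get(ns.idFromName(username));
}
```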
@@ -971,11 +969,11 @@ curl http://localhost:8787/api/v1/interviews \
 	-H "Cookie: username=testuser; HttpOnly"
 ```
 
-### Set up WebSocket handler
+## 7. Set up WebSockets to handle real-time communication
 
-Now that you have your basic interview management system in place, let's add real-time communication capabilities using WebSockets. You'll use Durable Objects to maintain WebSocket connections and handle real-time message processing.
+With the basic interview management system in place, you will now implement Durable Objects to handle real-time message processing and maintain WebSocket connections.
 
-First, update Interview Durable Object to handle WebSocket connections. Update `src/interview.ts`:
+Update the `Interview` Durable Object to handle WebSocket connections by adding the following code to `src/interview.ts`:
 
 ```typescript
 export class Interview extends DurableObject<CloudflareBindings> {
@@ -1000,7 +998,7 @@ export class Interview extends DurableObject<CloudflareBindings> {
 		return this.handleWebSocketUpgrade(request);
 	}
 
-	// If it's not a WebSocket request, we don't handle it
+	// If it is not a WebSocket request, we do not handle it
 	return new Response("Not found", { status: 404 });
 }
 
@@ -1084,7 +1082,7 @@ The WebSocket system provides real-time communication features for interview pra
 - To keep connections stable, it automatically responds to ping messages with pongs, preventing timeouts
 - Candidates and interviewers receive instant updates as the interview progresses, creating a natural conversational flow
 
-## 4. Add audio processing capabilities with Workers AI
+## 8. Add audio processing capabilities with Workers AI
 
 Now that the WebSocket connection is set up, the next step is to add speech-to-text capabilities using Workers AI. Let's use Cloudflare's Whisper model to transcribe audio in real-time during the interview.
 
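The ping/pong keep-alive mentioned in the first bullet is not shown in the diff; it can be sketched as a small control-message handler (the `{ type: "ping" }` message shape is an assumption for illustration):

```typescript
// Minimal view of a WebSocket for this sketch.
interface WsLike {
	send(data: string): void;
}

// Reply to client pings so idle connections are not closed by
// intermediaries. Returns true when the message was a control message
// and normal processing should be skipped.
function handleControlMessage(ws: WsLike, raw: string): boolean {
	const msg = JSON.parse(raw);
	if (msg.type === "ping") {
		ws.send(JSON.stringify({ type: "pong", at: Date.now() }));
		return true;
	}
	return false;
}
```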
@@ -1097,17 +1095,17 @@ The audio processing pipeline will work like this:
 5. We immediately send the transcription back to the client
 6. The client receives a notification that the AI interviewer is generating a response
 
-### Create Audio Processing Pipeline
+### Create audio processing pipeline
 
-Now let's update Interview Durable Object to handle binary audio data:
+In this step, you will update the `Interview` Durable Object to handle the following:
 
-1. Detects binary audio data sent through WebSocket
-2. Creates a unique message ID for tracking the processing status
-3. Notifies clients that audio processing has begun
-4. Includes error handling for failed audio processing
-5. Broadcasts status updates to all connected clients
+1. Detect binary audio data sent through WebSocket
+2. Create a unique message ID for tracking the processing status
+3. Notify clients that audio processing has begun
+4. Include error handling for failed audio processing
+5. Broadcast status updates to all connected clients
 
-First, let's update Interview Durable Object to handle binary WebSocket messages. Add the following methods to your `src/interview.ts` file:
+First, update the `Interview` Durable Object to handle binary WebSocket messages. Add the following methods to your `src/interview.ts` file:
 
 ```typescript title="src/interview.ts"
 // ... previous code ...
@@ -1189,23 +1187,23 @@ Your `handleBinaryAudio` method currently logs when it receives audio data. Next
 
 ### Configure speech-to-text
 
-Now that audio processing pipeline is set up, let's integrate Workers AI's Whisper model for speech-to-text transcription.
+Now that the audio processing pipeline is set up, you will integrate the Workers AI Whisper model for speech-to-text transcription.
 
-First, configure the Worker AI binding in your `wrangler.toml` file by adding:
+Configure the Workers AI binding in your `wrangler.toml` file by adding:
 
 ```toml
 # ... previous configuration ...
 [ai]
 binding = "AI"
 ```
 
-Next, you need to generate TypeScript types for our AI binding. Run the following command:
+Next, generate TypeScript types for your AI binding. Run the following command:
 
 ```sh
 npm run cf-typegen
 ```
 
-Now, let's create a new service class for AI operations. Create a new file called `services/AIService.ts`:
+You will need a new service class for AI operations. Create a new file called `services/AIService.ts`:
 
 ```typescript title="src/services/AIService.ts"
 import { InterviewError, ErrorCodes } from "../errors";
@@ -1235,7 +1233,7 @@ export class AIService {
 }
 ```
 
-Now, let's update Interview Durable Object to use this new AI service. Update the handleBinaryAudio method in `src/interview.ts`:
+You will need to update the `Interview` Durable Object to use this new AI service. To do this, update the `handleBinaryAudio` method in `src/interview.ts`:
 
 ```typescript title="src/interview.ts"
 import { AIService } from "./services/AIService";
@@ -1290,9 +1288,9 @@ private async handleBinaryAudio(ws: WebSocket, audioData: ArrayBuffer): Promise<
 The Whisper model `@cf/openai/whisper-tiny-en` is optimized for English speech recognition. If you need support for other languages, you can use different Whisper model variants available through Workers AI.
 :::
 
-Now when users speak during the interview, their audio will be automatically transcribed and stored as messages in the interview session. The transcribed text will be immediately available to both the user and the AI interviewer for generating appropriate responses.
+When users speak during the interview, their audio will be automatically transcribed and stored as messages in the interview session. The transcribed text will be immediately available to both the user and the AI interviewer for generating appropriate responses.
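The transcription call itself falls outside this hunk; assuming the `AI` binding configured earlier, it plausibly follows the Workers AI pattern below (typed loosely here so the sketch stands alone):

```typescript
// Sketch: transcribe a chunk of interview audio with Whisper through
// the Workers AI binding. The { text } response shape follows the
// Workers AI speech-to-text output format.
async function transcribeAudio(
	ai: { run(model: string, input: { audio: number[] }): Promise<{ text?: string }> },
	audioData: ArrayBuffer,
): Promise<string> {
	const result = await ai.run("@cf/openai/whisper-tiny-en", {
		audio: [...new Uint8Array(audioData)],
	});
	return result.text ?? "";
}
```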
 
-## 5. Integrate AI response generation
+## 9. Integrate AI response generation
 
 Now that you have audio transcription working, let's implement AI interviewer response generation using Workers AI's LLM capabilities. You'll create an interview system that:
 
@@ -1301,9 +1299,9 @@ Now that you have audio transcription working, let's implement AI interviewer re
 - Gives constructive feedback
 - Stays in character as a professional interviewer
 
-### 5.1. Set up Workers AI LLM integration
+### Set up Workers AI LLM integration
 
-First, let's enhance `AIService` class to handle LLM interactions. You will need to add methods for:
+First, update the `AIService` class to handle LLM interactions. You will need to add methods for:
 
 - Processing interview context
 - Generating appropriate responses
@@ -1354,7 +1352,7 @@ private prepareLLMMessages(interview: InterviewData) {
 The `@cf/meta/llama-2-7b-chat-int8` model is optimized for chat-like interactions and provides good performance while maintaining reasonable resource usage.
 :::
 
-### 5.2. Create conversation prompt
+### Create the conversation prompt
 
 Prompt engineering is crucial for getting high-quality responses from the LLM. Next, you will create a system prompt that:
 
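The body of `createSystemPrompt` is elided from this diff; a hypothetical sketch of such a prompt builder (the `InterviewData` fields here are assumptions, not the real type from `types.ts`) might be:

```typescript
// Assumed shape of the interview data for this sketch only.
interface InterviewData {
	title: string;
	skills: string[];
}

// Build a system prompt that keeps the LLM in character as a
// professional interviewer focused on the selected skills.
function createSystemPrompt(interview: InterviewData): string {
	return [
		`You are a professional technical interviewer conducting a "${interview.title}" interview.`,
		`Focus your questions on: ${interview.skills.join(", ")}.`,
		"Ask one question at a time, give brief constructive feedback on each answer,",
		"and stay in character as an interviewer at all times.",
	].join("\n");
}
```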
@@ -1376,9 +1374,9 @@ private createSystemPrompt(interview: InterviewData): string {
 }
 ```
 
-### 5.3. Implement response generation logic
+### Implement response generation logic
 
-Finally, let's integrate the LLM response generation into the interview flow. Update the `handleBinaryAudio` method in the `src/interview.ts` Durable Object to:
+Finally, integrate the LLM response generation into the interview flow. Update the `handleBinaryAudio` method in the `src/interview.ts` Durable Object to:
 
 - Process transcribed user responses
 - Generate appropriate AI interviewer responses
@@ -1454,7 +1452,7 @@ private async handleBinaryAudio(ws: WebSocket, audioData: ArrayBuffer): Promise<
 
 ## Conclusion
 
-You've successfully built an AI-powered interview practice tool using Cloudflare's Workers AI. Let's review what you've accomplished:
+You have successfully built an AI-powered interview practice tool using Cloudflare's Workers AI. In summary, you have:
 
 - Created a real-time WebSocket communication system using Durable Objects
 - Implemented speech-to-text processing with Workers AI Whisper model
