Commit 639c15d

chore: types and docstring

1 parent 151aa66

12 files changed: +251 -231 lines

docs/features/batch.md

Lines changed: 71 additions & 47 deletions
@@ -28,7 +28,7 @@ If your function fails to process any message from the batch, the entire batch r
This behavior changes when you enable the [ReportBatchItemFailures feature](https://docs.aws.amazon.com/lambda/latest/dg/services-sqs-errorhandling.html#services-sqs-batchfailurereporting) in your Lambda function event source configuration:

* [**SQS queues**](#sqs-standard). Only messages reported as failure will return to the queue for a retry, while successful ones will be deleted.
-* [**Kinesis data streams**](#kinesis-and-dynamodb-streams) and [**DynamoDB streams**](#kinesis-and-dynamodb-streams). Single reported failure will use its sequence number as the stream checkpoint. Multiple reported failures will use the lowest sequence number as checkpoint.
+* [**Kinesis data streams**](#kinesis-and-dynamodb-streams) and [**DynamoDB streams**](#kinesis-and-dynamodb-streams). A single reported failure will use its sequence number as the stream checkpoint; multiple reported failures will use the lowest sequence number as the checkpoint.

<!-- HTML tags are required in admonition content thus increasing line length beyond our limits -->
<!-- markdownlint-disable MD013 -->
@@ -213,7 +213,7 @@ By default, we catch any exception raised by your record handler function. This

1. Any exception works here. See [extending `BatchProcessor` section, if you want to override this behavior.](#extending-batchprocessor)

-2. Exceptions raised in `recordHandler` will propagate to `process_partial_response`. <br/><br/> We catch them and include each failed batch item identifier in the response dictionary (see `Sample response` tab).
+2. Exceptions raised in `recordHandler` will propagate to `processPartialResponse`. <br/><br/> We catch them and include each failed batch item identifier in the response dictionary (see `Sample response` tab).

=== "Sample response"

@@ -296,81 +296,105 @@ The behavior changes slightly when there are multiple item failures. Stream chec

### Parser integration

-Thanks to the [Parser utility](./parser.md) integration, you can pass a [Standard Schema](https://standardschema.dev){target="_blank"}-compatible schema when instantiating the `BatchProcessor` and we will use it to validate each item in the batch before passing it to your record handler.
+The Batch Processing utility integrates with the [Parser utility](./parser.md) to automatically validate and parse each batch record before processing. This ensures your record handler receives properly typed and validated data, eliminating the need for manual parsing and validation.

-Since this is an opt-in feature, you will need to import the `parser` function from `@aws-lambda-powertools/batch/parser`, this allows us to keep the parsing logic separate from the main processing logic and avoid increasing the bundle size.
+To enable parser integration, import the `parser` function from `@aws-lambda-powertools/batch/parser` and pass it along with a schema when instantiating the `BatchProcessor`.
+
+```typescript
+import { parser } from '@aws-lambda-powertools/batch/parser';
+```
+
+You have two approaches for schema validation:
+
+1. **Item schema only** (`innerSchema`) - Focus on your payload schema; we handle extending the base event structure
+2. **Full event schema** (`schema`) - Validate the entire event record structure with complete control
+
+#### Benefits of parser integration
+
+Parser integration eliminates runtime errors from malformed data and provides compile-time type safety, making your code more reliable and easier to maintain. Invalid records are automatically marked as failed and won't reach your handler, reducing defensive coding.
+
+=== "Without parser integration"
+
+    ```typescript
+    const recordHandler = async (record: SQSRecord) => {
+      // Manual parsing with no type safety
+      const payload = JSON.parse(record.body); // any type
+      console.log(payload.name); // No autocomplete, runtime errors possible
+    };
+    ```
+
+=== "With parser integration"
+
+    ```typescript
+    const mySchema = z.object({ name: z.string(), age: z.number() });
+
+    const recordHandler = async (record: ParsedRecord<SQSRecord, z.infer<typeof mySchema>>) => {
+      // Automatic validation and strong typing
+      console.log(record.body.name); // Full type safety and autocomplete
+    };
+    ```

#### Using item schema only

-When you only want to customize the schema of the item's payload you can pass an `innerSchema` objecta and we will use it to extend the base schema based on the `EventType` passed to the `BatchProcessor`.
+When you want to focus on validating your payload without dealing with the full event structure, use `innerSchema`. We automatically extend the base event schema for you, reducing boilerplate while still validating the entire record.
+
+Available transformers by event type:

-When doing this, you can also specify a `transformer` to tell us how to transform the payload before validation.
+| Event Type | Base Schema               | Available Transformers | When to use transformer                                        |
+|------------|---------------------------|------------------------|----------------------------------------------------------------|
+| SQS        | `SqsRecordSchema`         | `json`, `base64`       | `json` for stringified JSON, `base64` for encoded data         |
+| Kinesis    | `KinesisDataStreamRecord` | `base64`               | Required for Kinesis data (always base64 encoded)              |
+| DynamoDB   | `DynamoDBStreamRecord`    | `unmarshall`           | Required to convert DynamoDB attribute values to plain objects |

-=== "SQS - using inner payload"
+=== "SQS with JSON payload"

-    ```typescript hl_lines="6 11-14 19 26"
+    ```typescript hl_lines="6 12-15 19-21 27-28"
    --8<-- "examples/snippets/batch/advanced_parser_item_sqs.ts"
    ```

-=== "SQS - Sample Event"
+=== "Sample Event"

    ```json hl_lines="6 22"
    --8<-- "examples/snippets/batch/samples/parser_SQS.json"
    ```

-The example below shows how to use the `innerSchema` with `EventType.SQS`, but you can use it with other event types as well:
-
-| Event Type | Base Schema               | Transformer                 |
-|------------|---------------------------|-----------------------------|
-| SQS        | `SqsRecordSchema`         | `json`, `base64` (optional) |
-| Kinesis    | `KinesisDataStreamRecord` | `base64`                    |
-| DynamoDB   | `DynamoDBStreamRecord`    | `unmarshall`                |
-
-#### Extending built-in schemas
+#### Using full event schema

-When you want more control over the schema, you can extend a [built-in schema](./parser.md#built-in-schemas) for SQS, Kinesis Data Streams, or DynamoDB Streams with your own custom schema for the payload and we'll parse each item before passing it to your record handler. If the payload does not match the schema, the item will be marked as failed.
+For complete control over validation, extend the built-in schemas with your custom payload schema. This approach gives you full control over the entire event structure.

=== "SQS"

-    === "index.ts"
-
-        ```typescript hl_lines="6 14-16 20-23 29"
-        --8<-- "examples/snippets/batch/advanced_parser_sqs.ts"
-        ```
+    ```typescript hl_lines="6 15-17 22-24 30-31"
+    --8<-- "examples/snippets/batch/advanced_parser_sqs.ts"
+    ```

-    === "Sample Event"
+=== "Kinesis Data Streams"

-        ```json hl_lines="6 22"
-        --8<-- "examples/snippets/batch/samples/parser_SQS.json"
-        ```
+    ```typescript hl_lines="6 18-23 28-32 39 41-44"
+    --8<-- "examples/snippets/batch/advanced_parser_Kinesis.ts"
+    ```

=== "DynamoDB Streams"

-    === "index.ts"
-
-        ```typescript hl_lines="6 17-19 24-28 35"
-        --8<-- "examples/snippets/batch/advanced_parser_DynamoDB.ts"
-        ```
+    ```typescript hl_lines="6 17-19 24-28 35 37"
+    --8<-- "examples/snippets/batch/advanced_parser_DynamoDB.ts"
+    ```

-    === "Sample Event"
+#### Typed record handlers with ParsedRecord

-        ```json hl_lines="13-18 39-44"
-        --8<-- "examples/snippets/batch/samples/parser_DynamoDB.json"
-        ```
+To get full type safety in your record handlers, use the `ParsedRecord` utility type:

-=== "Kinesis Data Streams"
-
-    === "index.ts"
```typescript
388+
import type { ParsedRecord } from '@aws-lambda-powertools/batch';
364389

365-
```typescript hl_lines="6 17-22 27-31 38"
366-
--8<-- "examples/snippets/batch/advanced_parser_Kinesis.ts"
367-
```
390+
// For most cases - single schema
391+
type MyRecord = ParsedRecord<SQSRecord, z.infer<typeof mySchema>>;
368392

369-
=== "Sample Event"
393+
// For DynamoDB - separate schemas for NewImage and OldImage
394+
type MyDynamoRecord = ParsedRecord<DynamoDBRecord, z.infer<typeof newSchema>, z.infer<typeof oldSchema>>;
395+
```
370396

371-
```json hl_lines="8 24"
372-
--8<-- "examples/snippets/batch/samples/parser_Kinesis.json"
373-
```
397+
This eliminates verbose type annotations and provides clean autocompletion for your parsed data.
374398

375399
### Accessing processed messages
376400

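Taken together, the new docs describe the following wiring for the item-schema path. This is a minimal sketch, not the snippet file itself: it assumes the `BatchProcessor` options are named exactly as the docs above state (`parser`, `innerSchema`, `transformer`) and reuses only imports that appear in this commit; the authoritative example is `examples/snippets/batch/advanced_parser_item_sqs.ts`.

```typescript
import {
  BatchProcessor,
  EventType,
  processPartialResponse,
} from '@aws-lambda-powertools/batch';
import { parser } from '@aws-lambda-powertools/batch/parser';
import type { ParsedRecord } from '@aws-lambda-powertools/batch/types';
import { Logger } from '@aws-lambda-powertools/logger';
import type { SQSHandler, SQSRecord } from 'aws-lambda';
import { z } from 'zod';

const logger = new Logger();

// Payload schema only - the base SQS record schema is extended for us
const myItemSchema = z.object({ name: z.string(), age: z.number() });

// Assumed option names, matching the docs above: `transformer: 'json'`
// tells the parser the SQS body arrives as stringified JSON
const processor = new BatchProcessor(EventType.SQS, {
  parser,
  innerSchema: myItemSchema,
  transformer: 'json',
});

const recordHandler = async ({
  messageId,
  body: { name, age }, // already validated and typed via ParsedRecord
}: ParsedRecord<SQSRecord, z.infer<typeof myItemSchema>>) => {
  logger.info(`Processing record ${messageId}`, { name, age });
};

export const handler: SQSHandler = async (event, context) =>
  processPartialResponse(event, recordHandler, processor, { context });
```

Passing `parser` explicitly keeps schema validation opt-in, which is the bundle-size rationale the previous wording of the docs gave.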
examples/snippets/batch/advanced_parser_DynamoDB.ts

Lines changed: 3 additions & 10 deletions
@@ -4,14 +4,14 @@ import {
  processPartialResponse,
} from '@aws-lambda-powertools/batch';
import { parser } from '@aws-lambda-powertools/batch/parser';
+import type { ParsedRecord } from '@aws-lambda-powertools/batch/types';
import { Logger } from '@aws-lambda-powertools/logger';
import { DynamoDBMarshalled } from '@aws-lambda-powertools/parser/helpers/dynamodb';
import {
  DynamoDBStreamChangeRecordBase,
  DynamoDBStreamRecord,
} from '@aws-lambda-powertools/parser/schemas/dynamodb';
-import type { DynamoDBStreamEvent } from '@aws-lambda-powertools/parser/types';
-import type { DynamoDBStreamHandler } from 'aws-lambda';
+import type { DynamoDBRecord, DynamoDBStreamHandler } from 'aws-lambda';
import { z } from 'zod';

const myItemSchema = DynamoDBMarshalled(
@@ -34,14 +34,7 @@ const recordHandler = async ({
  dynamodb: {
    NewImage: { name, age },
  },
-}: Omit<DynamoDBStreamEvent['Records'][number], 'dynamodb'> & {
-  dynamodb: Omit<
-    DynamoDBStreamEvent['Records'][number]['dynamodb'],
-    'NewImage'
-  > & {
-    NewImage: z.infer<typeof myItemSchema>;
-  };
-}) => {
+}: ParsedRecord<DynamoDBRecord, z.infer<typeof myItemSchema>>) => {
  logger.info(`Processing record ${eventID}`, { name, age });
};

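The deleted annotation above is exactly the kind of hand-rolled intersection type that `ParsedRecord` replaces. As a rough illustration (a hypothetical shape, not the library's actual definition in `@aws-lambda-powertools/batch/types`), a `ParsedRecord`-style helper swaps the raw payload field of an event record for the schema-inferred type:

```typescript
import type { DynamoDBRecord, SQSRecord } from 'aws-lambda';

// Hypothetical shapes for illustration only - the real ParsedRecord is a
// single generic type that picks the payload field based on the event
// record it receives.
type SqsParsed<TBody> = Omit<SQSRecord, 'body'> & { body: TBody };

type DynamoDBParsed<TNew, TOld = TNew> = Omit<DynamoDBRecord, 'dynamodb'> & {
  dynamodb: Omit<NonNullable<DynamoDBRecord['dynamodb']>, 'NewImage' | 'OldImage'> & {
    NewImage: TNew;
    OldImage?: TOld;
  };
};
```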
examples/snippets/batch/advanced_parser_Kinesis.ts

Lines changed: 5 additions & 5 deletions
@@ -4,6 +4,7 @@ import {
  processPartialResponse,
} from '@aws-lambda-powertools/batch';
import { parser } from '@aws-lambda-powertools/batch/parser';
+import type { ParsedRecord } from '@aws-lambda-powertools/batch/types';
import { Logger } from '@aws-lambda-powertools/logger';
import { Base64Encoded } from '@aws-lambda-powertools/parser/helpers';
import {
@@ -37,11 +38,10 @@ const recordHandler = async ({
    sequenceNumber,
    data: { name, age },
  },
-}: Omit<KinesisDataStreamRecordEvent, 'kinesis'> & {
-  kinesis: Omit<KinesisDataStreamRecordEvent['kinesis'], 'data'> & {
-    data: z.infer<typeof myItemSchema>;
-  };
-}) => {
+}: ParsedRecord<
+  KinesisDataStreamRecordEvent,
+  z.infer<typeof myItemSchema>
+>) => {
  logger.info(`Processing record: ${sequenceNumber}`, {
    name,
    age,

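The `Base64Encoded` helper imported above lines up with the `base64` transformer row in the docs table. Below is a minimal sketch of how this snippet's item schema is plausibly defined, assuming `Base64Encoded` wraps an inner schema the same way `JSONStringified` does (decode first, then validate); the snippet file remains the source of truth.

```typescript
import { Base64Encoded } from '@aws-lambda-powertools/parser/helpers';
import { z } from 'zod';

// Kinesis record data always arrives base64-encoded; assumed behavior:
// the helper decodes it before validating against the inner zod schema
const myItemSchema = Base64Encoded(
  z.object({ name: z.string(), age: z.number() })
);
```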
examples/snippets/batch/advanced_parser_item_sqs.ts

Lines changed: 2 additions & 1 deletion
@@ -4,6 +4,7 @@ import {
  processPartialResponse,
} from '@aws-lambda-powertools/batch';
import { parser } from '@aws-lambda-powertools/batch/parser';
+import type { ParsedRecord } from '@aws-lambda-powertools/batch/types';
import { Logger } from '@aws-lambda-powertools/logger';
import type { SQSHandler, SQSRecord } from 'aws-lambda';
import { z } from 'zod';
@@ -24,7 +25,7 @@ const processor = new BatchProcessor(EventType.SQS, {
const recordHandler = async ({
  messageId,
  body: { name, age },
-}: SQSRecord & { body: z.infer<typeof myItemSchema> }) => {
+}: ParsedRecord<SQSRecord, z.infer<typeof myItemSchema>>) => {
  logger.info(`Processing record ${messageId}`, { name, age });
};

examples/snippets/batch/advanced_parser_sqs.ts

Lines changed: 2 additions & 3 deletions
@@ -4,6 +4,7 @@ import {
  processPartialResponse,
} from '@aws-lambda-powertools/batch';
import { parser } from '@aws-lambda-powertools/batch/parser';
+import type { ParsedRecord } from '@aws-lambda-powertools/batch/types';
import { Logger } from '@aws-lambda-powertools/logger';
import { JSONStringified } from '@aws-lambda-powertools/parser/helpers';
import { SqsRecordSchema } from '@aws-lambda-powertools/parser/schemas';
@@ -27,9 +28,7 @@ const processor = new BatchProcessor(EventType.SQS, {
const recordHandler = async ({
  messageId,
  body: { name, age },
-}: Omit<SqsRecord, 'body'> & {
-  body: z.infer<typeof myItemSchema>;
-}) => {
+}: ParsedRecord<SqsRecord, z.infer<typeof myItemSchema>>) => {
  logger.info(`Processing record ${messageId}`, { name, age });
};

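This snippet's imports (`JSONStringified`, `SqsRecordSchema`) point at the full-event-schema pattern from the docs. A minimal sketch of how the custom record schema is plausibly assembled, following the Parser utility's documented extension pattern; the snippet file remains the source of truth:

```typescript
import { JSONStringified } from '@aws-lambda-powertools/parser/helpers';
import { SqsRecordSchema } from '@aws-lambda-powertools/parser/schemas';
import { z } from 'zod';

// Payload carried inside the SQS message body
const myItemSchema = z.object({ name: z.string(), age: z.number() });

// Extend the built-in SQS record schema so the stringified JSON body is
// parsed and validated against myItemSchema before the handler runs
const myRecordSchema = SqsRecordSchema.extend({
  body: JSONStringified(myItemSchema),
});
```

The resulting `myRecordSchema` would then be handed to the `BatchProcessor` via the `schema` option described in the docs diff above.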
examples/snippets/batch/parser-integration/dynamoDBWithoutTransformer.ts

Lines changed: 0 additions & 32 deletions
This file was deleted.

examples/snippets/batch/parser-integration/kinesisWithoutTransformer.ts

Lines changed: 0 additions & 32 deletions
This file was deleted.

examples/snippets/batch/samples/parser_DynamoDB.json

Lines changed: 0 additions & 56 deletions
This file was deleted.
