This sample demonstrates how to use a custom data converter with compression in Cadence workflows. The data converter serializes and deserializes workflow inputs, outputs, and activity parameters, compressing the data to save storage space and bandwidth.
## Sample Description
The sample implements a custom compressed JSON data converter that:
- Serializes workflow inputs and activity parameters to JSON format
- Compresses the JSON data using gzip compression to reduce size
- Decompresses and deserializes workflow outputs and activity results from JSON format
- Provides significant storage and bandwidth savings for large payloads
- Demonstrates advanced data converter patterns for production use cases
- Shows real-time compression statistics and size comparisons
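The sample's actual source is not reproduced here, but a minimal sketch of a gzip-compressing JSON converter matching the two-method shape of Cadence's `encoded.DataConverter` interface (`ToData`/`FromData`) could look like the following. The implementation details (newline-separated JSON documents, the `MyPayload` field shown) are illustrative assumptions, not the sample's exact code:

```go
package main

import (
	"bytes"
	"compress/gzip"
	"encoding/json"
	"fmt"
)

// compressedJSONDataConverter serializes values to JSON, then gzips the
// result. Shape matches Cadence's encoded.DataConverter (ToData/FromData);
// the internals here are an illustrative sketch.
type compressedJSONDataConverter struct{}

// ToData marshals each value as a newline-separated JSON document and
// compresses the combined payload with gzip.
func (c compressedJSONDataConverter) ToData(values ...interface{}) ([]byte, error) {
	var plain bytes.Buffer
	enc := json.NewEncoder(&plain)
	for _, v := range values {
		if err := enc.Encode(v); err != nil {
			return nil, err
		}
	}
	var compressed bytes.Buffer
	zw := gzip.NewWriter(&compressed)
	if _, err := zw.Write(plain.Bytes()); err != nil {
		return nil, err
	}
	if err := zw.Close(); err != nil {
		return nil, err
	}
	return compressed.Bytes(), nil
}

// FromData decompresses the payload and decodes each JSON document into
// the corresponding target pointer.
func (c compressedJSONDataConverter) FromData(data []byte, targets ...interface{}) error {
	zr, err := gzip.NewReader(bytes.NewReader(data))
	if err != nil {
		return err
	}
	defer zr.Close()
	dec := json.NewDecoder(zr)
	for _, t := range targets {
		if err := dec.Decode(t); err != nil {
			return err
		}
	}
	return nil
}

func main() {
	type MyPayload struct{ Msg string }
	conv := compressedJSONDataConverter{}
	data, _ := conv.ToData(MyPayload{Msg: "hello"}, 42)
	var p MyPayload
	var n int
	_ = conv.FromData(data, &p, &n)
	fmt.Println(p.Msg, n) // hello 42
}
```

Using a streaming `json.Decoder` over the decompressed bytes lets one compressed blob carry multiple arguments, mirroring how a data converter receives variadic workflow inputs.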
The sample includes two workflows:
1. **Simple Workflow**: Processes a basic `MyPayload` struct
2. **Large Payload Workflow**: Processes a complex `LargePayload` with nested objects, arrays, and extensive data to demonstrate compression benefits

All data is automatically compressed during serialization and decompressed during deserialization, with compression statistics displayed at runtime.
## Key Components
- **Custom Data Converter**: `compressedJSONDataConverter` implements the `encoded.DataConverter` interface with gzip compression
- **Simple Workflow**: `dataConverterWorkflow` demonstrates basic payload processing with compression
- **Large Payload Workflow**: `largeDataConverterWorkflow` demonstrates processing complex data structures with compression
- **Activities**: `dataConverterActivity` and `largeDataConverterActivity` process different payload types
- **Large Payload Generator**: `CreateLargePayload()` creates realistic complex data for compression demonstration