# Data Converter Sample

This sample demonstrates how to use a custom data converter in Cadence workflows. The data converter is responsible for serializing and deserializing workflow inputs, outputs, and activity parameters; this one also gzip-compresses that data to save storage space and bandwidth.

## Sample Description

The sample implements a custom compressed JSON data converter that:
- Serializes workflow inputs and activity parameters to JSON format
- Compresses the JSON data using gzip compression to reduce size
- Decompresses and deserializes workflow outputs and activity results from JSON format
- Provides significant storage and bandwidth savings for large payloads
- Demonstrates advanced data converter patterns for production use cases
- Shows real-time compression statistics and size comparisons

The sample includes two workflows:
1. **Simple Workflow**: Processes a basic `MyPayload` struct
2. **Large Payload Workflow**: Processes a complex `LargePayload` with nested objects, arrays, and extensive data to demonstrate compression benefits

All data is automatically compressed during serialization and decompressed during deserialization, with compression statistics displayed at runtime.
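
A converter like the one described above can be written roughly as follows. This is a minimal sketch against the `encoded.DataConverter` interface of the Cadence Go client, not the sample's exact code; only the type name is borrowed from the sample:

```go
package main

import (
	"bytes"
	"compress/gzip"
	"encoding/json"

	"go.uber.org/cadence/encoded"
)

// compressedJSONDataConverter JSON-encodes values and gzips the result.
type compressedJSONDataConverter struct{}

// NewCompressedJSONDataConverter returns the converter as the interface type
// expected by the worker options.
func NewCompressedJSONDataConverter() encoded.DataConverter {
	return &compressedJSONDataConverter{}
}

// ToData serializes each value as JSON and compresses the whole stream.
func (c *compressedJSONDataConverter) ToData(values ...interface{}) ([]byte, error) {
	var buf bytes.Buffer
	zw := gzip.NewWriter(&buf)
	enc := json.NewEncoder(zw)
	for _, v := range values {
		if err := enc.Encode(v); err != nil {
			return nil, err
		}
	}
	// Close flushes the gzip footer; skipping it produces a truncated payload.
	if err := zw.Close(); err != nil {
		return nil, err
	}
	return buf.Bytes(), nil
}

// FromData decompresses the payload and decodes each JSON value in order.
func (c *compressedJSONDataConverter) FromData(data []byte, valuePtrs ...interface{}) error {
	zr, err := gzip.NewReader(bytes.NewReader(data))
	if err != nil {
		return err
	}
	defer zr.Close()
	dec := json.NewDecoder(zr)
	for _, vp := range valuePtrs {
		if err := dec.Decode(vp); err != nil {
			return err
		}
	}
	return nil
}
```

Encoding all values into a single gzip stream keeps header overhead down; compressing each argument separately would also work but adds a few dozen bytes of gzip framing per value.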

## Key Components

- **Custom Data Converter**: `compressedJSONDataConverter` implements the `encoded.DataConverter` interface with gzip compression
- **Simple Workflow**: `dataConverterWorkflow` demonstrates basic payload processing with compression
- **Large Payload Workflow**: `largeDataConverterWorkflow` demonstrates processing complex data structures with compression
- **Activities**: `dataConverterActivity` and `largeDataConverterActivity` process different payload types
- **Large Payload Generator**: `CreateLargePayload()` creates realistic complex data for compression demonstration
- **Compression Statistics**: `GetPayloadSizeInfo()` shows before/after compression metrics
- **Tests**: Includes unit tests for both simple and large payload workflows
- **Compression**: Automatic gzip compression/decompression for all workflow data

## Steps to Run Sample

1. You need a Cadence service running. See details in cmd/samples/README.md

2. Run the following command to start the worker:
   ```
   ./bin/dataconverter -m worker
   ```

3. Run the following command to execute the workflow:
   ```
   ./bin/dataconverter -m trigger
   ```

You should see:
- Compression statistics showing original vs compressed data sizes
- Workflow logs showing the processing of large payloads
- Activity execution logs with payload information
- Final workflow completion with compression benefits noted

## Customization

To implement your own data converter with compression or other features:
1. Create a struct that implements the `encoded.DataConverter` interface
2. Implement the `ToData` method for serialization and compression
3. Implement the `FromData` method for decompression and deserialization
4. Register the converter in the worker options (see the sketch below)
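
A minimal sketch of step 4, assuming the Cadence Go client's `worker` package; the service client, logger, domain, and task list name are placeholders, while the workflow and activity names are the sample's:

```go
package main

import (
	"go.uber.org/cadence/.gen/go/cadence/workflowserviceclient"
	"go.uber.org/cadence/worker"
	"go.uber.org/zap"
)

// startWorker wires the custom converter into the worker options so that every
// payload this worker produces or consumes passes through ToData/FromData.
// The workflow and activity functions are defined elsewhere in the sample.
func startWorker(service workflowserviceclient.Interface, logger *zap.Logger) error {
	workerOptions := worker.Options{
		Logger:        logger,
		DataConverter: NewCompressedJSONDataConverter(),
	}
	w := worker.New(service, "samples-domain", "dataConverterTaskList", workerOptions)
	w.RegisterWorkflow(dataConverterWorkflow)
	w.RegisterWorkflow(largeDataConverterWorkflow)
	w.RegisterActivity(dataConverterActivity)
	w.RegisterActivity(largeDataConverterActivity)
	return w.Start()
}
```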

This pattern is useful when you need to:
- Reduce storage costs and bandwidth usage with compression
- Use specific serialization formats for performance or compatibility
- Add encryption/decryption to workflow data
- Implement custom compression algorithms (LZ4, Snappy, etc.; a Snappy variant is sketched below)
- Support legacy data formats
- Add data validation or transformation during serialization
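
As an example of swapping in a different algorithm, the gzip step can be replaced with Snappy block compression. This is a hypothetical variant assuming the `github.com/golang/snappy` package; `encodeSnappyJSON` and `decodeSnappyJSON` are illustrative helper names, not part of the sample:

```go
package main

import (
	"bytes"
	"encoding/json"

	"github.com/golang/snappy"
)

// encodeSnappyJSON JSON-encodes the values and Snappy-compresses the result;
// it could back a converter's ToData in place of the gzip version.
func encodeSnappyJSON(values ...interface{}) ([]byte, error) {
	var buf bytes.Buffer
	enc := json.NewEncoder(&buf)
	for _, v := range values {
		if err := enc.Encode(v); err != nil {
			return nil, err
		}
	}
	return snappy.Encode(nil, buf.Bytes()), nil
}

// decodeSnappyJSON reverses encodeSnappyJSON for a converter's FromData.
func decodeSnappyJSON(data []byte, valuePtrs ...interface{}) error {
	raw, err := snappy.Decode(nil, data)
	if err != nil {
		return err
	}
	dec := json.NewDecoder(bytes.NewReader(raw))
	for _, vp := range valuePtrs {
		if err := dec.Decode(vp); err != nil {
			return err
		}
	}
	return nil
}
```

Snappy trades compression ratio for speed, so it can be a better fit when payloads are processed frequently and CPU time matters more than stored bytes.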

## Performance Benefits

The compressed data converter provides:
- **Storage Savings**: Typically 60-80% reduction in data size for JSON payloads
- **Bandwidth Reduction**: Lower network transfer costs and faster data transmission
- **Cost Optimization**: Reduced storage costs in Cadence history
- **Scalability**: Better performance with large payloads
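
Actual savings depend on the payload shape; repetitive JSON compresses far better than already-compressed or random data. A quick way to sanity-check the numbers is to compare the raw JSON size with the converter's output, reusing the converter sketched earlier (the payload value here is only illustrative):

```go
package main

import (
	"encoding/json"
	"fmt"
	"log"
)

func main() {
	// Any JSON-serializable value works; repetitive fields compress especially well.
	payload := map[string]interface{}{
		"message": "hello",
		"items":   make([]int, 1000), // 1000 zeros -> highly compressible JSON
	}

	raw, err := json.Marshal(payload)
	if err != nil {
		log.Fatal(err)
	}
	compressed, err := NewCompressedJSONDataConverter().ToData(payload)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("raw=%d bytes, compressed=%d bytes (%.0f%% smaller)\n",
		len(raw), len(compressed),
		100*(1-float64(len(compressed))/float64(len(raw))))
}
```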