---
title: "New workshop: Process a CSV file"
date: 2026-02-04
summary: Learn to build a complete data processing pipeline with our new intermediate workshop
authors:
  - pvinaches
tags:
  - Kaoto Workshop
  - CSV Processing
  - Enterprise Integration Patterns
  - Kafka
  - PostgreSQL
  - Data Pipeline
  - Intermediate
---
We're excited to announce a new **intermediate-level workshop** that teaches you how to build a complete data processing pipeline using **Kaoto's visual designer** and **Apache Camel**!

## What's the Workshop About?

The [**Process a CSV file**](/workshop/intermediate-csv-processor) workshop guides you through building a real-world integration pipeline that processes healthcare patient data. You'll work with actual data from the Synthea patient dataset and learn how to combine multiple **Enterprise Integration Patterns (EIPs)** to create a robust, production-ready system.

## What You'll Build

A five-route integration pipeline that demonstrates:

- **CSV Ingestion** - Reading and parsing CSV files with automatic archiving
- **Data Validation** - Content-based routing to filter records by data quality
- **Database Integration** - Persisting valid records to PostgreSQL
- **Message Publishing** - Real-time monitoring with Apache Kafka
- **Error Handling** - Capturing and managing invalid records for review

The complete pipeline reads patient data from a CSV file, validates each record, stores valid data in a database, publishes to Kafka for monitoring, and captures invalid records in error files for later correction.
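
To give you a taste, here is a minimal sketch of the ingestion and validation steps in Camel's YAML DSL (the format Kaoto edits visually). The directory names, the `BIRTHDATE` header, and the validation rule are hypothetical placeholders, not the workshop's exact configuration:

```yaml
# Hypothetical sketch: poll a directory for CSV files, parse each row,
# and route records based on a simple data-quality check.
- route:
    id: csv-ingestion
    from:
      uri: file:data/input
      parameters:
        include: .*\.csv
        move: .done          # archive processed files into a subfolder
        idempotent: true     # never process the same file twice
      steps:
        - unmarshal:
            csv:
              useMaps: true  # each row becomes a map keyed by header name
        - split:
            simple: ${body}  # handle one record at a time
            steps:
              - choice:
                  when:
                    - simple: ${body[BIRTHDATE]} != ''
                      steps:
                        - to: direct:valid-records
                  otherwise:
                    steps:
                      - to: direct:invalid-records
```

In the workshop you build this route (and its siblings) visually in Kaoto rather than writing the YAML by hand.
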
## What You'll Learn

This hands-on workshop covers essential integration patterns and technologies:

- **File polling and CSV data ingestion** - Automated file processing with idempotent consumers
- **Content-based routing** - Intelligent message routing based on data validation rules
- **Database integration** - Working with PostgreSQL using Camel Kamelets
- **Message publishing** - Publishing events to Apache Kafka topics
- **Error handling and data quality management** - Building resilient pipelines with proper error capture
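
The database and Kafka steps above can be sketched in the same YAML DSL. This fragment uses the `postgresql-sink` Kamelet and the Kafka component; the connection details, table, columns, and topic name are illustrative assumptions, not the workshop's actual values:

```yaml
# Hypothetical sketch: persist a valid record to PostgreSQL,
# then publish it to a Kafka topic for monitoring.
- route:
    id: persist-and-publish
    from:
      uri: direct:valid-records
      steps:
        - to:
            uri: kamelet:postgresql-sink
            parameters:
              serverName: localhost
              serverPort: 5432
              username: postgres
              password: postgres
              databaseName: patients
              query: INSERT INTO patient (id, birthdate) VALUES (:#id, :#birthdate)
        - marshal:
            json: {}       # serialize the record before publishing
        - to:
            uri: kafka:patients.monitoring
            parameters:
              brokers: localhost:9092
```

Kamelets like `postgresql-sink` are a key theme of the workshop: they wrap connector configuration behind a small set of named parameters, so the route stays readable.
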
## Who Is This For?

This is an **intermediate-level workshop** designed for developers who:

- Have a basic understanding of integration concepts and data processing pipelines
- Are familiar with VSCode and command-line tools
- Want to learn how to build production-ready integration solutions
- Are interested in visual low-code development with Kaoto
## Let's Build It Together

Let us know what you think by joining us in the [GitHub discussions](https://github.com/orgs/KaotoIO/discussions).
Do you have an idea for improving Kaoto? Whether you'd like to see a useful feature implemented or simply want to ask a question, please [create an issue](https://github.com/KaotoIO/kaoto/issues/new/choose).

## Get Started

* **Workshop**: [Process a CSV file](/workshop/intermediate-csv-processor)
* **Demo code**: [GitHub repository](https://github.com/KaotoIO/kaoto-examples/tree/main/csv-processor)
* **Kaoto quickstart**: [Getting started guide](/docs/quickstart/)
* **VS Code extension**: [Install from marketplace](https://marketplace.visualstudio.com/items?itemName=redhat.vscode-kaoto)
* **Try Kaoto online**: [Showcase deployment](https://red.ht/kaoto)

Happy integrating! 🚀