This example demonstrates how to use Kong Gateway as an event gateway with Kafka integration, including schema validation using a Schema Registry. The setup showcases both producing and consuming Kafka messages with Avro schema validation.
The example sets up a complete event-driven architecture with the following components:
- Kong Gateway: Acts as an API gateway and event gateway for Kafka operations
- Apache Kafka: Message broker for event streaming
- Schema Registry (Apicurio): Manages and validates message schemas (Confluent-compatible)
- Schema Registry UI: Web interface for managing schemas
- **Kafka (Port 9092)**
  - Apache Kafka 3.9.0 running in KRaft mode
  - Single-broker setup for development
- **Schema Registry (Port 8080)**
  - Apicurio Registry 3.0.9 with Confluent compatibility
  - Manages Avro and JSON schemas
- **Schema Registry UI (Port 8888)**
  - Web interface for schema management
- **Kong Gateway (Ports 8000, 8443)**
  - API gateway with Kafka plugins
  - Admin API and Kong Manager UI available
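For orientation, the components above might be wired together in Docker Compose roughly as follows. This is an illustrative sketch only: the image tags follow the versions stated above, the service names match the container names used in the troubleshooting commands, and everything else is an assumption; the real definitions live in `docker-compose.yaml`.

```yaml
# Illustrative sketch; see docker-compose.yaml for the real definitions.
services:
  kafka:                        # Apache Kafka 3.9.0, KRaft mode, single broker
    image: apache/kafka:3.9.0
    ports:
      - "9092:9092"
  schema-registry:              # Apicurio Registry with Confluent-compatible API
    image: apicurio/apicurio-registry:3.0.9
    ports:
      - "8080:8080"
  # Schema Registry UI service (port 8888) omitted from this sketch.
  kong-docker:                  # Kong Gateway with Kafka plugins
    image: kong/kong-gateway    # assumption: EE image; the OSS image is `kong`
    ports:
      - "8000:8000"
      - "8443:8443"
    depends_on:
      - kafka
      - schema-registry
```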
The example configures several routes with Kafka plugins:
- **Producer** (`/kafka/schema/:topic`, `kafka-upstream` plugin)
  - Produces messages to the specified topic with schema validation
  - Schema validation using the Avro schema from the registry
  - Synchronous message production
  - Message transformation via Lua functions
- **Basic Consumer** (`/kafka/rest/no-schema/:topic`)
  - REST-based message consumption
  - No schema validation
- **Schema-Validated Consumer** (`/kafka/rest/schema/avro/:topic`)
  - REST-based consumption with schema validation
  - Deserializes messages using registry schemas
- **Server-Sent Events Consumer** (`/kafka/sse/schema/avro/:topic`)
  - Real-time message streaming via SSE
  - Schema validation enabled
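In Kong's declarative configuration, the producer route above might be declared along these lines. This is a hedged sketch: `bootstrap_servers` and `topic` are real `kafka-upstream` fields, but the service and route names are mine, and the schema-registry settings are omitted because their exact field names vary by Kong version; see `kong-config/kong.yaml` for the actual configuration.

```yaml
# Illustrative sketch; see kong-config/kong.yaml for the real configuration.
_format_version: "3.0"
services:
  - name: kafka-producer        # name is illustrative
    url: http://localhost       # unused upstream; kafka-upstream answers the request itself
    routes:
      - name: kafka-schema-produce
        paths:
          - /kafka/schema
    plugins:
      - name: kafka-upstream
        config:
          bootstrap_servers:
            - host: kafka
              port: 9092
          topic: my-topic
          # Schema-registry / Avro settings omitted: field names vary by Kong version.
```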
The Avro schema used for validation:

```json
{
  "type": "record",
  "name": "UserRecord",
  "namespace": "kong.avro",
  "fields": [
    {
      "name": "username",
      "type": "string"
    },
    {
      "name": "age",
      "type": "int"
    }
  ]
}
```

The JSON schema used for validation:

```json
{
  "$schema": "https://json-schema.org/draft-07/schema#",
  "type": "object",
  "properties": {
    "username": {
      "type": "string",
      "minLength": 3
    },
    "age": {
      "type": "integer",
      "minimum": 0
    }
  },
  "required": ["username", "age"]
}
```

Prerequisites:

- Docker and Docker Compose
- Kong Enterprise license (for EE version)
Start the stack:

```shell
docker-compose up -d
```

Once running, the services are available at:

- Kong Gateway: http://localhost:8000
- Schema Registry: http://localhost:8080
- Schema Registry UI: http://localhost:8888
First, register the Avro schema in the schema registry:
```shell
curl -X POST http://localhost:8080/apis/ccompat/v7/subjects/user-value/versions \
  -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  -d '{
    "schema": "{\"type\":\"record\",\"name\":\"UserRecord\",\"namespace\":\"kong.avro\",\"fields\":[{\"name\":\"username\",\"type\":\"string\"},{\"name\":\"age\",\"type\":\"int\"}]}"
  }'
```

Send a message through Kong to Kafka with schema validation:

```shell
curl -X POST http://localhost:8000/kafka/schema/my-topic \
  -H "Content-Type: application/json" \
  -d '{
    "username": "john_doe",
    "age": 30
  }'
```

Consume messages without schema validation:

```shell
curl http://localhost:8000/kafka/rest/no-schema/my-topic
```

Consume messages with Avro schema validation:

```shell
curl http://localhost:8000/kafka/rest/schema/avro/my-topic
```

Stream messages in real time via Server-Sent Events:

```shell
curl -N http://localhost:8000/kafka/sse/schema/avro/my-topic
```

Kong also supports JSON Schema validation as an alternative to Avro. This section demonstrates how to use JSON Schema for message validation.
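The `schema` field in the registration payloads must contain the schema as an escaped JSON string, which is tedious to write by hand. A small helper can build the payload from a schema file; this is my own sketch, assuming `python3` is available, and `user.avsc` is a file name I chose for illustration:

```shell
# Write the Avro schema to a file (illustrative file name).
cat > user.avsc <<'EOF'
{"type":"record","name":"UserRecord","namespace":"kong.avro","fields":[{"name":"username","type":"string"},{"name":"age","type":"int"}]}
EOF

# Wrap the schema text as an escaped JSON string under the "schema" key.
payload=$(python3 -c 'import json, sys; print(json.dumps({"schema": sys.stdin.read().strip()}))' < user.avsc)
echo "$payload"

# The payload can then be POSTed as in the registration call above:
# curl -X POST http://localhost:8080/apis/ccompat/v7/subjects/user-value/versions \
#   -H "Content-Type: application/vnd.schemaregistry.v1+json" -d "$payload"
```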
Register the JSON schema in the schema registry:
```shell
curl -X POST http://localhost:8080/apis/ccompat/v7/subjects/user-json-value/versions \
  -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  -d '{
    "schema": "{\"$schema\":\"https://json-schema.org/draft-07/schema#\",\"type\":\"object\",\"properties\":{\"username\":{\"type\":\"string\",\"minLength\":3},\"age\":{\"type\":\"integer\",\"minimum\":0}},\"required\":[\"username\",\"age\"]}"
  }'
```

Send a message through Kong to Kafka with JSON schema validation:

```shell
curl -X POST http://localhost:8000/kafka/schema/my-topic \
  -H "Content-Type: application/json" \
  -d '{
    "username": "jane_doe",
    "age": 28
  }'
```

Note: messages must comply with the JSON schema constraints:
- `username` must be a string with a minimum length of 3 characters
- `age` must be an integer with a minimum value of 0
- Both fields are required
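A payload can be checked against these constraints locally before it is sent, so that malformed messages never reach the gateway. This is a minimal sketch of my own (the `check_user` helper is not part of the example), assuming `python3` is available:

```shell
# check_user: exit non-zero if the payload violates the schema constraints
# (username: string, length >= 3; age: integer >= 0). Illustrative helper only.
check_user() {
  printf '%s' "$1" | python3 -c '
import json, sys
doc = json.load(sys.stdin)
assert isinstance(doc.get("username"), str) and len(doc["username"]) >= 3
assert isinstance(doc.get("age"), int) and doc["age"] >= 0
'
}

check_user '{"username": "jane_doe", "age": 28}' && echo valid       # prints: valid
check_user '{"username": "jo", "age": 28}' 2>/dev/null || echo invalid  # prints: invalid
```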
Consume messages with JSON schema validation:

```shell
curl http://localhost:8000/kafka/rest/schema/json/my-topic
```

The repository contains the following configuration files:

- `docker-compose.yaml`: open-source Kong setup
- `docker-compose.ee.yaml`: Enterprise Kong setup
- `kong-config/kong.yaml`: Kong declarative configuration
- `konnect.env`: Kong Konnect data plane configuration
- `ee.env`: Kong Enterprise license configuration
- Schema Validation: Automatic validation of Kafka messages using Avro schemas
- Multiple Consumer Patterns: REST and Server-Sent Events consumption
- Schema Registry Integration: Confluent-compatible schema management
- Message Transformation: Lua-based message processing
- Enterprise Features: Kong Manager UI and advanced plugins (EE version)
- Schema Registry Connection: Ensure schema registry is running before Kong
- Topic Creation: Kafka auto-creates topics, but you can pre-create them if needed
- Schema Registration: Register schemas before producing messages with validation
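The startup-order note above can be scripted: poll the registry until it answers before bringing up Kong. A sketch of my own (the `wait_for` helper is not part of the example), assuming `curl` is available:

```shell
# Poll a URL until it responds with success or the attempt budget runs out.
# Usage: wait_for <url> <attempts>; returns non-zero on timeout.
wait_for() {
  attempts=$2
  while [ "$attempts" -gt 0 ]; do
    if curl -fsS "$1" >/dev/null 2>&1; then
      return 0
    fi
    attempts=$((attempts - 1))
    sleep 2
  done
  return 1
}

# Example: block until the registry answers, then start Kong.
# wait_for http://localhost:8080 30 && docker-compose up -d kong-docker
```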
```shell
# View Kong logs
docker logs kong-docker

# View Kafka logs
docker logs kafka

# View Schema Registry logs
docker logs schema-registry
```

To stop the stack and remove its volumes:

```shell
docker-compose down -v
```

This example provides a foundation for building event-driven architectures with Kong Gateway, demonstrating schema validation, multiple consumption patterns, and integration with modern event streaming platforms.