Interact with Apache Kafka® directly in VS Code. Manage clusters, produce and consume messages, and explore topics.
Browse your Kafka infrastructure in the sidebar:
- 📊 Clusters - Connect to multiple Kafka clusters simultaneously
- 📂 Topics - View, create, and delete topics with real-time updates
- 🖥️ Brokers - Monitor broker health and configuration
- 👥 Consumer Groups - Track consumer lag and group membership
- ⚙️ Configurations - Inspect and manage cluster settings
- 🧬 Schema Registries - Browse subjects and schema versions linked to your clusters
Create producers using simple `.kafka` files with rich features:
- 🎲 Randomized Data - Generate test data with Faker.js templates
- 🔑 Headers & Keys - Full support for message keys and custom headers
- ⏱️ Scheduled Production - Produce messages at regular intervals (`every: 5s`, `every: 1m`)
- 🔁 Batch Production - Send multiple messages at once for load testing
- 🎯 Multiple Producers - Define multiple producers in a single file
- ✅ JSON Schema Validation - Validate `value-format: json` payloads with an inline schema or a `file(...)` reference
- 🧬 Avro & Protobuf - Produce/consume Avro (`value-schema` inline or file) and Protobuf (`value-schema` file only)
Example:
```
PRODUCER user-events
topic: user-activity
every: 3s
key: user-{{string.uuid}}
headers: source=web-app, version=1.0
{
    "userId": "{{string.uuid}}",
    "event": "{{helpers.arrayElement(['login', 'logout', 'purchase'])}}",
    "timestamp": {{$timestamp}},
    "user": {
        "name": "{{person.fullName}}",
        "email": "{{internet.email}}"
    }
}
```
Value format examples (JSON / Avro / Protobuf):
```
PRODUCER
topic: json-events
value-format: json
value-schema: {"type":"object","required":["id"],"properties":{"id":{"type":"number"}}}
{"id":1}

###

PRODUCER
topic: avro-events
value-format: avro
value-schema: {"type":"record","name":"UserEvent","fields":[{"name":"id","type":"int"}]}
{"id":1}

###

PRODUCER
topic: protobuf-events
value-format: protobuf(demo.UserCreated)
value-schema: file(./schemas/user-events.proto)
{"id":1,"email":"jane@example.com","active":true}
```
The `./schemas/user-events.proto` file used by `protobuf(demo.UserCreated)`:

```proto
syntax = "proto3";

package demo;

message UserCreated {
    int32 id = 1;
    string email = 2;
    bool active = 3;
}
```

Consume messages with the Message Viewer:
- 📊 Message Viewer - Table view with search, filters, histogram, and CSV export
- 🎯 Targeted Consumption - Consume from specific partitions or offsets
- 💾 Export Data - Export consumed messages to CSV for analysis
Start consuming from:
- Right-click a topic in the explorer
- Use Command Palette (`Ctrl+Shift+P`)
- Define consumers in `.kafka` files
```
CONSUMER analytics-team
topic: user-events
from: earliest
partitions: 0,1,2
```
Consume Avro values with schema:
```
CONSUMER analytics-team
topic: user-events
from: earliest
value-format: avro
value-schema: file(./schemas/user-event.avsc)
```
Consume Protobuf values with schema file:
```
CONSUMER analytics-team
topic: protobuf-events
from: earliest
value-format: protobuf(demo.UserCreated)
value-schema: file(./schemas/user-events.proto)
```
- 🔒 SASL Authentication - PLAIN, SCRAM-256, SCRAM-512 (Kafka 0.10+)
- 🌐 OAUTHBEARER - OAuth 2.0 authentication with automatic token refresh
- ☁️ AWS MSK IAM - Native AWS IAM authentication for Amazon MSK clusters
- 🛡️ SSL/TLS Support - Secure connections with certificate validation
- 🔑 Secure Storage - Credentials stored in OS keychain (macOS Keychain, Windows Credential Manager, Linux Secret Service)
- 🧪 Development Mode - Optional hostname verification bypass for self-signed certificates
Manage Confluent-compatible Schema Registries directly from VS Code:
- 🔗 Reusable Connections - Define named Schema Registry connections independent of clusters
- 🔗 Cluster Linking - Link a registry to one or more Kafka clusters
- 📂 Explorer Browsing - Browse subjects and schema versions in the sidebar
- 📄 Open & Compare - Open schema versions in the editor and diff any two versions side-by-side
- 🔍 Topic Subject Discovery - Automatically discover subjects related to a topic based on TopicNameStrategy, RecordNameStrategy, or TopicRecordNameStrategy
- 🔒 Auth & TLS - Optional basic auth and custom TLS configuration per registry
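For reference, the three naming strategies mentioned above derive a topic's value subject as follows. This is a minimal illustrative sketch of the standard Confluent conventions, not the extension's own code:

```python
# Illustrative sketch: how Confluent subject naming strategies map a
# topic and a fully qualified record name to a Schema Registry subject.
def value_subject(strategy: str, topic: str, record_fqn: str) -> str:
    """Return the subject name for a message *value* under each strategy."""
    if strategy == "TopicNameStrategy":
        return f"{topic}-value"            # one value schema per topic
    if strategy == "RecordNameStrategy":
        return record_fqn                  # one schema per record type
    if strategy == "TopicRecordNameStrategy":
        return f"{topic}-{record_fqn}"     # per topic *and* record type
    raise ValueError(f"unknown strategy: {strategy}")

print(value_subject("TopicNameStrategy", "user-events", "demo.UserCreated"))
# user-events-value
```

So for the `protobuf-events` topic and the `demo.UserCreated` record, discovery would look for the subjects `protobuf-events-value`, `demo.UserCreated`, or `protobuf-events-demo.UserCreated`, depending on the strategy in use.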
- ✅ Create Topics - Configure partitions, replication factor, and topic settings
- 🗑️ Delete Topics - Remove unwanted topics with confirmation dialogs
- 🧹 Delete Records - Empty topics by deleting all messages from all partitions
- 📋 Metadata Inspection - Dump detailed metadata for clusters, brokers, and topics
- 👥 Consumer Group Management - Delete consumer groups and monitor offsets
Search for "Kafka" in the VS Code Extensions view and install the extension, or install it from the Marketplace website.
Click the **+** icon in the Kafka Explorer, or use `Ctrl+Shift+P` → "Kafka: Add Cluster".
- Browse topics and consumer groups
- Right-click to produce or consume messages
- Create `.kafka` files for reusable workflows
📚 Need Help? Open documentation inside VS Code with `Ctrl+Shift+P` → "Kafka: Open Documentation"
| Topic | Description |
|---|---|
| Kafka Explorer | Navigating clusters, topics, brokers, and consumer groups |
| Producing Messages | Creating producers with Faker templates and scheduled production |
| Consuming Messages | Message Viewer and consumption options |
| .kafka File Format | Syntax reference for producer and consumer definitions |
| Settings | Extension configuration options |
Extend the Kafka explorer by creating custom cluster providers. Your extension can:
- Discover clusters from external sources (cloud providers, configuration management)
- Auto-configure connection settings
- Provide custom authentication mechanisms
Create a Cluster Provider Extension:
- Add `"kafka-provider"` to your extension's `package.json` keywords
- Implement the cluster provider API
- Users discover your extension via "Discover Cluster Providers"
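A minimal `package.json` fragment for a provider extension might look like this. The `name` and `publisher` values are placeholders; per the steps above, the `"kafka-provider"` keyword is what makes the extension discoverable:

```json
{
  "name": "my-kafka-cluster-provider",
  "publisher": "example-publisher",
  "keywords": ["kafka-provider"]
}
```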
We ❤️ contributions! Whether you're:
- 🐛 Reporting bugs
- 💡 Suggesting features
- 📝 Improving documentation
- 💻 Submitting pull requests
All contributions are welcome! See CONTRIBUTING.md for guidelines.
- Clone the repository
- Run `npm install`
- Open in VS Code and press `F5` to launch the Extension Development Host
- Make your changes and run tests with `npm test`
Try the latest development version:
- Go to the CI Workflow page
- Click on the most recent successful run
- Download the `vscode-kafka` artifact
- Unzip and install the `.vsix` file: `code --install-extension vscode-kafka-*.vsix`
MIT License. See LICENSE file.
Apache, Apache Kafka®, Kafka® and associated logos are trademarks of the Apache Software Foundation (ASF). Tools for Apache Kafka® is not affiliated with, endorsed by, or otherwise associated with the Apache Software Foundation or any of its projects.