Commit 4e3c715

Update schemaregistry README (#109)

1 parent 7d5d246 commit 4e3c715

File tree

3 files changed: +115 -69 lines changed

README.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -1,4 +1,4 @@
-Confluent's Javascript Client for Apache Kafka<sup>TM</sup>
+Confluent's JavaScript Client for Apache Kafka<sup>TM</sup>
 =====================================================
 
 **confluent-kafka-javascript** is Confluent's JavaScript client for [Apache Kafka](http://kafka.apache.org/) and the
````

examples/kafkajs/sr.js

Lines changed: 6 additions & 2 deletions

````diff
@@ -74,9 +74,11 @@ const run = async () => {
     }
   )
 
+  // Create an Avro serializer
+  const ser = new AvroSerializer(registry, SerdeType.VALUE, { useLatestVersion: true });
+
   // Produce a message with schemaA.
   await producer.connect()
-  const ser = new AvroSerializer(registry, SerdeType.VALUE, { useLatestVersion: true });
   const outgoingMessage = {
     key: 'key',
     value: await ser.serialize(topicName, { id: 1, b: { id: 2 } }),
@@ -89,11 +91,13 @@ const run = async () => {
   await producer.disconnect();
   producer = null;
 
+  // Create an Avro deserializer
+  const deser = new AvroDeserializer(registry, SerdeType.VALUE, {});
+
   await consumer.connect()
   await consumer.subscribe({ topic: topicName })
 
   let messageRcvd = false;
-  const deser = new AvroDeserializer(registry, SerdeType.VALUE, {});
   await consumer.run({
     eachMessage: async ({ message }) => {
       const decodedMessage = {
````
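The change above moves serializer and deserializer construction ahead of the connect calls and labels each with a comment. For context, here is a minimal standalone sketch of the serialize/deserialize round-trip this example relies on, using only the APIs shown in the diff; the registry URL, topic name, and payload shape are assumptions for illustration, and it presumes a compatible Avro schema is already registered under the topic's value subject:

```javascript
// Round-trip sketch using only the APIs shown in the diff above.
// Assumes a Schema Registry at http://localhost:8081 and that
// 'test-topic-value' already has a compatible Avro schema registered.
const { SchemaRegistryClient, SerdeType, AvroSerializer, AvroDeserializer } =
  require('@confluentinc/schemaregistry');

const registry = new SchemaRegistryClient({ baseURLs: ['http://localhost:8081'] });

async function roundTrip() {
  // The serializer resolves the latest registered schema for the subject.
  const ser = new AvroSerializer(registry, SerdeType.VALUE, { useLatestVersion: true });
  const deser = new AvroDeserializer(registry, SerdeType.VALUE, {});

  // serialize() produces Confluent wire-format bytes (magic byte + schema ID
  // + Avro payload); deserialize() reverses the process.
  const bytes = await ser.serialize('test-topic', { fullName: 'John Doe' });
  console.log(await deser.deserialize('test-topic', bytes)); // { fullName: 'John Doe' }
}

roundTrip().catch(console.error);
```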

schemaregistry/README.md

Lines changed: 108 additions & 66 deletions

````diff
@@ -1,79 +1,121 @@
-Confluent's Javascript Client for Schema Registry<sup>TM</sup>
+Confluent's JavaScript Client for Schema Registry<sup>TM</sup>
 =====================================================
 
-**confluent-kafka-javascript** includes Confluent's JavaScript client for [Schema Registry](https://docs.confluent.io/cloud/current/sr/index.html) and the accompanying package to Confluent's Javascript Client for Apache Kafka
-[Confluent's Javascript Client for Apache Kafka](https://www.npmjs.com/package/@confluentinc/kafka-javascript). This is an **Early Availability** library. The goal is to provide an highly performant, reliable and easy to use JavaScript client in line with other clients such as our [Go](https://github.com/confluentinc/confluent-kafka-go) and [Python](https://github.com/confluentinc/confluent-kafka-python) clients.
-
-<!-- Features:
-- **High performance** - confluent-kafka-javascript is a lightweight wrapper around
-[librdkafka](https://github.com/confluentinc/librdkafka), a finely tuned C
-client.
-- **Reliability** - There are a lot of details to get right when writing an Apache Kafka
-client. We get them right in one place (librdkafka) and leverage this work
-across all of our clients.
-- **Supported** - Commercial support is offered by [Confluent](https://confluent.io/).
-- **Future proof** - Confluent, founded by the
-creators of Kafka, is building a [streaming platform](https://www.confluent.io/product/)
-with Apache Kafka at its core. It's high priority for us that client features keep
-pace with core Apache Kafka and components of the [Confluent Platform](https://www.confluent.io/product/).
-This library leverages the work and concepts from two popular Apache Kafka JavaScript clients: [node-rdkafka](https://github.com/Blizzard/node-rdkafka) and [KafkaJS](https://github.com/tulios/kafkajs). The core is heavily based on the node-rdkafka library, which uses our own [librdkafka](https://github.com/confluentinc/librdkafka) library for core client functionality. However, we leverage a promisified API and a more idiomatic interface, similar to the one in KafkaJS, making it easy for developers to migrate and adopt this client depending on the patterns and interface they prefer. We're very happy to have been able to leverage the excellent work of the many authors of these libraries!
-### This library is currently in limited-availability - it is supported for all usage but for the schema-registry client.
-To use **Schema Registry**, use the existing [kafkajs/confluent-schema-registry](https://github.com/kafkajs/confluent-schema-registry) library that is compatible with this library. For a simple schema registry example, see [sr.js](https://github.com/confluentinc/confluent-kafka-javascript/blob/dev_early_access_development_branch/examples/kafkajs/sr.js).
-**DISCLAIMER:** Although it is compatible with **confluent-kafka-javascript**, Confluent does not own or maintain kafkajs/confluent-schema-registry, and the use and functionality of the library should be considered "as is".
-## Requirements
-The following configurations are supported:
-* Any supported version of Node.js (The two LTS versions, 18 and 20, and the latest versions, 21 and 22).
-* Linux (x64 and arm64) - both glibc and musl/alpine.
-* macOS - arm64/m1.
-* Windows - x64.
-Installation on any of these platforms is meant to be seamless, without any C/C++ compilation required.
-In case your system configuration is not within the supported ones, [a supported version of Python](https://devguide.python.org/versions/) must be available on the system for the installation process. [This is required for the `node-gyp` build tool.](https://github.com/nodejs/node-gyp?tab=readme-ov-file#configuring-python-dependency).
+Confluent's JavaScript client for [Schema Registry](https://docs.confluent.io/cloud/current/sr/index.html) supports Avro, Protobuf and JSON Schema, and is designed to work with
+[Confluent's JavaScript Client for Apache Kafka](https://www.npmjs.com/package/@confluentinc/kafka-javascript). This is an **Early Availability** library.
+The goal is to provide a highly performant, reliable and easy to use JavaScript client in line with other Schema Registry clients
+such as our [Go](https://github.com/confluentinc/confluent-kafka-go), [.NET](https://github.com/confluentinc/confluent-kafka-dotnet),
+and [Java](https://github.com/confluentinc/schema-registry) clients.
+
+## Installation
 ```bash
-npm install @confluentinc/kafka-javascript
+npm install @confluentinc/schemaregistry
 ```
-Yarn and pnpm support is experimental.
+
 # Getting Started
-Below is a simple produce example for users migrating from KafkaJS.
+Below is a simple example of using Avro serialization with the Schema Registry client and the KafkaJS client.
 ```javascript
-// require('kafkajs') is replaced with require('@confluentinc/kafka-javascript').KafkaJS.
-const { Kafka } = require("@confluentinc/kafka-javascript").KafkaJS;
-async function producerStart() {
-    const kafka = new Kafka({
-        kafkaJS: {
-            brokers: ['<fill>'],
-            ssl: true,
-            sasl: {
-                mechanism: 'plain',
-                username: '<fill>',
-                password: '<fill>',
-            },
-        }
-    });
-    const producer = kafka.producer();
-    await producer.connect();
-    console.log("Connected successfully");
-    const res = []
-    for (let i = 0; i < 50; i++) {
-        res.push(producer.send({
-            topic: 'test-topic',
-            messages: [
-                { value: 'v222', partition: 0 },
-                { value: 'v11', partition: 0, key: 'x' },
-            ]
-        }));
+const { Kafka } = require('@confluentinc/kafka-javascript').KafkaJS;
+const { SchemaRegistryClient, SerdeType, AvroSerializer, AvroDeserializer} = require('@confluentinc/schemaregistry');
+
+const registry = new SchemaRegistryClient({ baseURLs: ['http://localhost:8081'] })
+const kafka = new Kafka({
+  kafkaJS: {
+    brokers: ['localhost:9092']
+  }
+});
+
+let consumer = kafka.consumer({
+  kafkaJS: {
+    groupId: "test-group",
+    fromBeginning: true,
+  },
+});
+let producer = kafka.producer();
+
+const schema = {
+  type: 'record',
+  namespace: 'examples',
+  name: 'RandomTest',
+  fields: [
+    { name: 'fullName', type: 'string' }
+  ],
+};
+
+const topicName = 'test-topic';
+const subjectName = topicName + '-value';
+
+const run = async () => {
+  // Register schema
+  const id = await registry.register(
+    subjectName,
+    {
+      schemaType: 'AVRO',
+      schema: JSON.stringify(schema)
     }
-    await Promise.all(res);
-    await producer.disconnect();
-    console.log("Disconnected successfully");
+  )
+
+  // Create an Avro serializer
+  const ser = new AvroSerializer(registry, SerdeType.VALUE, { useLatestVersion: true });
+
+  // Produce a message with the schema
+  await producer.connect()
+  const outgoingMessage = {
+    key: 'key',
+    value: await ser.serialize(topicName, { fullName: 'John Doe' }),
+  }
+  await producer.send({
+    topic: topicName,
+    messages: [outgoingMessage]
+  });
+  console.log("Producer sent its message.")
+  await producer.disconnect();
+  producer = null;
+
+  // Create an Avro deserializer
+  const deser = new AvroDeserializer(registry, SerdeType.VALUE, {});
+
+  await consumer.connect()
+  await consumer.subscribe({ topic: topicName })
+
+  let messageRcvd = false;
+  await consumer.run({
+    eachMessage: async ({ message }) => {
+      const decodedMessage = {
+        ...message,
+        value: await deser.deserialize(topicName, message.value)
+      };
+      console.log("Consumer received message.\nBefore decoding: " + JSON.stringify(message) + "\nAfter decoding: " + JSON.stringify(decodedMessage));
+      messageRcvd = true;
+    },
+  });
+
+  // Wait around until we get a message, and then disconnect.
+  while (!messageRcvd) {
+    await new Promise((resolve) => setTimeout(resolve, 100));
+  }
+
+  await consumer.disconnect();
+  consumer = null;
 }
-producerStart();
+
+run().catch (async e => {
+  console.error(e);
+  consumer && await consumer.disconnect();
+  producer && await producer.disconnect();
+  process.exit(1);
+})
 ```
-1. If you're migrating from `kafkajs`, you can use the [migration guide](MIGRATION.md#kafkajs).
-2. If you're migrating from `node-rdkafka`, you can use the [migration guide](MIGRATION.md#node-rdkafka).
-3. If you're starting afresh, you can use the [quickstart guide](QUICKSTART.md).
-An in-depth reference may be found at [INTRODUCTION.md](INTRODUCTION.md). -->
+
+## Features and Limitations
+- Full Avro and JSON Schema support
+- Protobuf support requires (upcoming) release: CP 7.4.8, 7.5.7, 7.6.4, 7.7.2, 7.8.0
+- Support for CSFLE (Client-Side Field Level Encryption)
+- Support for schema migration rules for Avro and JSON Schema
+- Data quality rules are not yet supported
+- Support for OAuth
 
 ## Contributing
 
 Bug reports and feedback is appreciated in the form of Github Issues.
-For guidelines on contributing please see [CONTRIBUTING.md](CONTRIBUTING.md)
\ No newline at end of file
+For guidelines on contributing please see [CONTRIBUTING.md](CONTRIBUTING.md)
````
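The new feature list advertises full JSON Schema support alongside Avro, so a JSON Schema variant of the registration and serde steps would look roughly like the sketch below. The `JsonSerializer`/`JsonDeserializer` class names are assumed counterparts to the Avro classes in the diff and are not confirmed by this commit; `schemaType: 'JSON'` follows standard Schema Registry conventions.

```javascript
// Hypothetical JSON Schema variant of the Avro flow above.
// JsonSerializer/JsonDeserializer are assumed analogues of
// AvroSerializer/AvroDeserializer; verify against the package exports.
const { SchemaRegistryClient, SerdeType, JsonSerializer, JsonDeserializer } =
  require('@confluentinc/schemaregistry');

const registry = new SchemaRegistryClient({ baseURLs: ['http://localhost:8081'] });

const jsonSchema = {
  $schema: 'http://json-schema.org/draft-07/schema#',
  type: 'object',
  properties: { fullName: { type: 'string' } },
  required: ['fullName'],
};

const run = async () => {
  // Register the JSON schema under the topic's value subject.
  await registry.register('test-topic-value', {
    schemaType: 'JSON',
    schema: JSON.stringify(jsonSchema),
  });

  const ser = new JsonSerializer(registry, SerdeType.VALUE, { useLatestVersion: true });
  const deser = new JsonDeserializer(registry, SerdeType.VALUE, {});

  const bytes = await ser.serialize('test-topic', { fullName: 'John Doe' });
  console.log(await deser.deserialize('test-topic', bytes));
};

run().catch(console.error);
```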
